
Automotive design is a specialized discipline in which designers are challenged to create emotionally appealing designs. From a practice perspective, this requires designers to apply their hermeneutic as well as reflective design thinking skills. However, due to the increasing demand for new car models, it is not always possible to keep generating new car designs without some form of assistance. It is therefore common practice in the automotive industry to use Automated Morphing Systems (AMS) to facilitate and accelerate the design process. AMS, however, while an efficient algorithm-driven tool for form generation, lacks the emotional knowledge of human beings, as well as the ability to introduce a "creative", and preferably a "winning", design. The purpose of this research is to study designers' reasoning about product (automotive) form, their form generation activity, and the implications of both. The research objective is to understand how designers generate forms driven by their implicit values, beliefs and attitudes towards designing, and how these are supported by their visualization and representation skills. Four research questions were formulated to obtain firm answers to the questions posed in this research. The main empirical effort of this thesis was the generation of measurable and testable data, involving both qualitative and quantitative research to gather and analyze designers' implicit and explicit knowledge. A design research methodology framework consisting of three parts was used in this data gathering exercise: descriptive study I, prescriptive study, and descriptive study II. These involved methods such as surveys, observation studies and evaluation studies. Master's students' evaluations, as well as the designers' own interpretations of their sketches representing the sequence of morphed forms, were considered essential aspects of the empirical studies. The findings of this study can be summarized as follows:

1. Approaches to form development vary among designers according to their experience, which affects their sketching abilities, activities, and implicit thinking patterns. In their sketching and form development activities, designers emphasize the most informative views, such as the façade and three-quarter front views, over other views of the car. Rather than adopting a uniform transformation strategy that encompasses the entire car, they select which elements to morph.

2. In manual form generation, designers contribute personal and creative input to the development of the overall car form, its selected elements, and the regions that determine the overall character of the car. The major differences between the morphing approaches of designers and of automated CAD systems lie in the recognition and interpretation of the meaning of form elements.

3. Given the inability of AMS to morph selectively and inconsistently, or to introduce ambiguity and variance, it is suggested here that AMS may be useful only for convergent transformation, which typically occurs during the later stages of the styling process.

4. Although perceptions vary according to how representations are presented in the morphing process, the Perceptual Product Experience (PPE) framework can still be considered a useful tool for establishing familiarity, for understanding the quality characteristics and nature of the product, and for determining the meanings and assessing the values of form elements.
In conclusion, the work presents a descriptive model of practice-based design thinking about form development in automotive design, with manual interpolative morphing as the focal area of study. The study categorizes meaning with respect to designer perception. Based on the study of manual morphing exercises, a new methodology has been developed for analyzing form syntactics, pragmatics and semantics in relation to design thinking, form development, and automotive design.

Srishti Palani, Autodesk Research, Canada, and Design Lab, University of California, United States

With the rapid development of creativity support tools, creative practitioners (e.g., designers, artists, architects) have to constantly explore and adopt new tools into their practice. While HCI research has focused on developing novel creativity support tools, little is known about creative practitioners' values when exploring and adopting these tools. We collect and analyze 23 videos, 13 interviews, and 105 survey responses of creative practitioners reflecting on their values to derive a value framework. We find that practitioners value the tools' functionality, integration into their current workflow, performance, user interface and experience, learning support, costs and emotional connection, in that order. They largely discover tools through personal recommendations. To help unify and encourage reflection from the wider community of CST stakeholders (e.g., systems creators, researchers, marketers, educators), we situate the framework within existing research on systems, creativity support tools and technology adoption.

Keywords: Creativity Support Tools, Creative Practitioners, Tool Adoption

ACM Reference Format:
Srishti Palani, David Ledo, George Fitzmaurice, and Fraser Anderson. 2022. "I don't want to feel like I'm working in a 1960s factory": The Practitioner Perspective on Creativity Support Tool Adoption. In CHI Conference on Human Factors in Computing Systems (CHI '22), April 29-May 5, 2022, New Orleans, LA, USA. ACM, New York, NY, USA, 18 pages. https://doi.org/10.1145/3491102.3501933

Figure 1: Visual abstract of practitioners’ values when adopting creativity support tools, showing 3 contributions: C1. Empirical Observations, C2. Creative Practitioners’ Value Framework, C3. Mapping values to design principles and theories in literature.

1 INTRODUCTION

Creative practitioners (e.g., professional designers, software developers, artists, architects, film-makers) harness digital technology to achieve their goals and augment their creative potential. These digital technologies supporting creative practice – Creativity Support Tools (CSTs; e.g., AutoCAD and Illustrator) – have been studied for decades, and supporting creativity is considered a "grand challenge" in HCI research [27, 71]. Frich et al. [27] define a CST as "technology that runs on one or more digital systems, encompasses one or more creativity-focused features, and is employed to positively influence users of varying expertise in one or more distinct phases of the creative process."

Over the past couple of decades, creative domains, i.e., industries that conceive products and services [18, 61], such as design, software development, architecture, film, and entertainment, have grown in both industry and research [27, 57, 71]. In this rapidly evolving landscape, it has become imperative for creative practitioners to constantly explore CSTs and decide whether to adopt new tools or abandon current ones. However, little is known about creative practitioners' values when choosing and exploring tools.

HCI research has developed several novel tools to stimulate creative thinking and support design processes (e.g., [20, 26, 27, 28, 37]). However, most of these prototype CSTs exist in a lab setting – few explorations examine tools in the wild over a long period of time [27, 57, 59, 70]. To address this issue, Frich et al. [27] suggest "shifting our efforts to studying in-vivo use of creativity support tools, not just the ones we build ourselves, but the ones that most creative practitioners employ in practice". This premise motivates our research questions:

RQ1: What do creative practitioners value when adopting CSTs?

RQ2: How do creative practitioners discover and explore new CSTs?

To address these questions, we analyzed 13 interviews and 23 YouTube videos of creative practitioners reflecting on their values when adopting CSTs. We synthesize the findings from this analysis in a conceptual framework of values held by creative practitioners when deciding whether to adopt a new CST. Then, to contextualize and verify identified trends in values with a larger population of creative practitioners, we surveyed 105 creative practitioners and asked them to rate and rank each of the values in the framework.

This investigation uncovers that creative practitioners care about multiple factors: a CST's features and functionality, integration with their existing workflow, performance, interface and user experience, support, financial cost, and even an emotional connection with the tool. Delving into the subcategories, the highest-rated values were a CST's reliable performance and ease of use. This paper makes the following contributions (Figure 1):

C1. Empirical observations from creative practitioners [§4]. The analysis of YouTube videos, practitioner interviews and survey responses adds the perspective of creative practitioners to the existing developer-, educator- and researcher-centric perspectives on CSTs described in the literature.

C2. Creative Practitioners’ Value Framework [§4]. A conceptual framework of creative practitioners’ values for discovery and adoption of CSTs as shaped by C1.

C3. Unified mapping of practitioners’ values to design principles and theories in literature [§5]. We connect our proposed framework to principles in existing literature to encourage reflection and innovation from CST stakeholders (e.g. systems creators, researchers, marketers, educators).

To contextualize these contributions, we describe existing design heuristics in HCI systems, CST research, and theories of tool adoption [§2]. The research methods [§3] detail how we collected and analyzed the video, interview, and survey data. The definitions of our framework, along with the empirical observations and numerical data, are outlined [§4] before being tied back to the foundational literature [§5]. We conclude by discussing the limitations and avenues for future work [§6].

2 RELATED WORK

To frame practitioners' values, one must consider both the technology itself and practitioners' usage preferences and needs. This paper builds on HCI systems research, CST design and evaluation, and social science theories of technology acceptance and adoption.

2.1 Designing and Evaluating Creativity Support Tools

As a sub-field of HCI research, the study of CSTs formally began two decades ago, when Shneiderman alluded to computers' potential to become tools that enhance human creativity [69, 70]. CST research has developed tools for many stages of creative work, from making discoveries or inventions through information gathering [43, 54], hypothesis and idea generation [72], and initial production [20, 25], to refinement [37], validation [26], and dissemination [27, 70].

Note how the term "tool" is tied to the human-centered perspective on what is used to accomplish a task, ranging from applications (e.g., Figma) and toolkits (e.g., D3 for visualizing data) to programming languages (e.g., C#), as opposed to individual commands (e.g., undo, copy) or a tool's features (e.g., using a brush inside an application).

The HCI and creativity research communities have proposed quantitative and qualitative approaches to evaluate the usefulness of CSTs. One quantitative measure is the Creativity Support Index [14, 17], a general-purpose survey to gauge a novel CST's effectiveness. Other methods include co-design workshops [22], physiological responses (e.g., galvanic skin response, EEG) [15], and self-report in post-study reflective think-alouds and surveys [65, 79].

CST research also follows design principles proposed by HCI systems research. Myers [46] outlines that systems should facilitate: (i) a Path of Least Resistance (i.e., leading users towards doing the right things and away from doing the wrong things); (ii) Predictability (i.e., alignment with the user's mental model); and (iii) "Low Thresholds, High Ceilings, and Wide Walls" (i.e., tools should be easy for novices to get started with, yet provide the ambitious functionality that experts need, along with a wide range of functionality and underlying services). Olsen [51] outlines similar concepts: (i) Generality (the ability of a tool to generalize across situations, tasks and users); (ii) Reducing Solution Viscosity (reducing the effort required to iterate on many possible solutions); (iii) Enabling Expressive Leverage, such that a designer can accomplish more by expressing less; (iv) Facilitating Expressive Match (how closely the means for expressing design choices map to the problem being solved); and (v) Power in Combination (supporting combinations of more basic building blocks through (a) Inductive Combination, i.e., combining features within one tool to accomplish larger, more complex goals, or (b) Simplifying Interconnection, i.e., all components/features of the tool working with each other within and across other tools). Similarly, the Cognitive Dimensions of Notations framework [8, 30] has also been used to reflect on systems, though its usage in the literature has decreased in favour of Olsen's framework, likely given the high overlap [38].

Similar design principles and heuristics are outlined in CST research as well. For example, Resnick et al. [59] echo Myers' [46] design principle of (i) "low thresholds, high ceilings and wide walls". Resnick et al. also propose additional principles: (ii) support many paths and many styles; (iii) support collaboration; (iv) support open interchange; (v) make it as simple as possible; and (vi) choose black boxes of explorability carefully. Additional perspectives informed by developers and HCI researchers include: (vii) invent things you would want to use yourself; (viii) balance user suggestions with observation and participatory processes; (ix) iterate; (x) design for designers; and (xi) evaluate your tools. Shneiderman [70] frames general design recommendations for CSTs: (i) support exploratory search; (ii) enable collaboration; (iii) provide rich history-keeping; and (iv) design with low thresholds, high ceilings and wide walls. The above systems and CST research papers acknowledge the importance of establishing frameworks that foster reflection on systems' usefulness and contributions to the research and user communities.

Recent surveys of CST and HCI systems research show a focus on building novel tools, often evaluated in controlled experiments with novices and students as primary subjects [38, 57]. This might be due to research prototypes' limited resources to operate at scale. It constrains the understanding of in-the-wild use of CSTs by practitioners over long periods of time. Still, there is room to better understand long-term tool use within people's existing practices and to use these findings to better inform system building in HCI.

In practice, creative professionals usually opt for CSTs made by established industry tech companies; for example, digital designers use Adobe Illustrator or InDesign, and programmers use Microsoft Visual Studio [75]. This paper builds on and unifies these multi-disciplinary reflections and sheds light on long-term perspectives when exploring, adopting, retaining, and abandoning CSTs.

2.2 Theoretical Background On Technology Adoption

Research in the social sciences has explored what influences individuals' acceptance and adoption of emerging technologies in education, healthcare, and other information-intensive domains.

Rogers [60] defines technology adoption as "a decision to make full use of an innovation as the best course of action" (p. 473). The adoption process includes an individual's acceptance or rejection of the innovation, its subsequent use, and purchasing and acquisition decisions [58]. Rogers' Innovation Diffusion Theory [60] posits a five-stage process for technology adoption – the innovation-decision process: (i) Knowledge occurs when an individual learns about an innovation; (ii) Persuasion involves the individual forming an opinion on the innovation; (iii) Decision occurs when the individual prepares to choose to adopt (or reject) an innovation; (iv) Implementation is when the individual uses the innovation; and (v) Confirmation is when the individual reinforces the decision to adopt or reject the innovation. Rogers' Innovation Diffusion Theory proposes that users base technology adoption decisions on perceptions of the tool's: (i) relative advantage (the extent to which a new technology is seen as beneficial over the preceding one – similar to performance expectancy); (ii) complexity (the difficulty of using it – similar to effort expectancy); (iii) compatibility (the extent to which using the target technology is viewed as compatible with the user's beliefs, values, and work patterns); (iv) trialability (the possibility to try, experiment, reduce uncertainty, and learn by doing prior to adopting); and (v) observability (the visibility of the results of adoption, which stimulates discussion, interest, and uptake). Other theories exist [34, 41, 66, 68, 82, 83], yet they have received criticism for excluding external conditions [23, 73, 74, 83].

Parallel research on technology acceptance has also been developed, including the Theory of Reasoned Action [64], the Theory of Planned Behaviour [4], the Technology Acceptance Model and TAM2 [40], and the Unified Theory of Acceptance and Use of Technology (UTAUT) by Venkatesh et al. [77]. These models predict that technology acceptance is influenced by: (i) performance expectancy / perceived usefulness (the extent to which potential users expect performance improvements from using the new technology); (ii) effort expectancy / ease of use (the extent to which people expect usage to be free of effort); and (iii) social influence / subjective norms (perceived pressure from others to use the technology). These theories focus on predicting acceptance rather than actual use and adoption of technology. While the terms "adoption" and "acceptance" are often used interchangeably, they refer to two distinct aspects. Acceptance is viewed as a component of adoption [58], such as the willingness to use technology for the tasks it was designed to support [21]; willingness and actual use are separate measures. This paper unifies the vocabulary used to describe CST design principles and theoretical model parameters, and adds a layer of granularity and richness to existing models by presenting empirical observations from practitioners.

3 METHOD

To understand what influences creative practitioners when exploring and adopting CSTs, we followed a two-fold approach:

1. Observation. We collected 23 YouTube videos and conducted 13 semi-structured interviews with creative practitioners to gain an initial overview of values across participants.

2. Survey. To verify and contextualize the observed trends with a larger population of practitioners (105 responses), we designed a survey for practitioners to rate and rank the different values.

Questionnaires are available in the supplementary materials and were approved by our organizations’ ethics review.

3.1 YouTube Videos

We chose YouTube's comprehensive public video database as a starting point because it includes practitioners sharing knowledge through vlogs, tutorials, personal experience, etc., and these videos have a wide reach to general audiences.

Sampling. To sample videos, we queried YouTube with keywords such as "why I switched to..." and selected autocomplete suggestions about CSTs. Sample queries include "why I switched to Figma from Sketch" and "why I switched from AutoCAD to Revit". We excluded queries less relevant to CSTs, e.g., "why I switched to..." "...iPhone from Android", "...formula". Because we focused on comparisons and creators' reflections, we excluded videos mentioning a single CST.

Filtering. We ensured coverage of multiple creative domains, such as 3D modeling, software development, creative writing, architecture, video editing, and UI/UX design (Figure 13 in Appendix). To base our data on audience relevance, we selected videos with over 10,000 views. We collected material past data saturation in case a particular domain yielded new findings.
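To make the filtering step concrete, the sketch below expresses it as a small script. It is only an illustration: the 10,000-view threshold and the example queries come from this section, while the record structure and helper function are hypothetical, not the authors' actual pipeline.

```python
# Hypothetical sketch of the video sampling/filtering step (Section 3.1).
# Only the view threshold and the example queries come from the paper.

MIN_VIEWS = 10_000  # audience-relevance threshold described above

videos = [
    {"query": "why I switched to Figma from Sketch", "domain": "UI/UX design", "views": 152_000},
    {"query": "why I switched from AutoCAD to Revit", "domain": "architecture", "views": 48_500},
    {"query": "why I switched to iPhone from Android", "domain": None, "views": 900_000},  # not CST-relevant
]

def passes_filters(video: dict) -> bool:
    """Keep CST-relevant videos (a creative domain was identified) with enough reach."""
    return video["domain"] is not None and video["views"] >= MIN_VIEWS

sampled = [v for v in videos if passes_filters(v)]
print(f"kept {len(sampled)} of {len(videos)} videos;",
      "domains covered:", sorted({v["domain"] for v in sampled}))
```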

3.2 Semi-Structured Interviews

While the YouTube dataset provides a baseline of data, it has two key limitations. First, the videos surfaced are decided by YouTube's internal algorithm, which has its own biases shaped by its code, advertisements, company sponsorship, audience, search location, etc. Second, the videos are crafted by content creators, leading to short narratives designed to capture an audience. To further expand and enrich the data, we interviewed professional practitioners.

3.2.1 Participants. We chose purposeful sampling [9] as our recruitment strategy, mixing direct contacts with recruitment through a large software company's Slack channel and a university. We interviewed a diverse mix of participants across practices, ages, organizations, genders, races, locations, cultures, and target audiences. We recruited 13 participants (8 male, 5 female) across nine creative fields, including graphic design, UX design, architecture, industrial design, software programming, film, game design, and sketching (Figure 12 in Appendix). While we reached data saturation by the 8th participant, we continued interviews to achieve larger coverage of professions/roles. Participants' ages ranged from 22 to 59 years (M = 33.23, SD = 7.10). Compensation was $50 USD or equivalent for the one-hour interviews.

3.2.2 Procedure. Before the interview, participants answered a demographic questionnaire collecting: age, gender, occupation, organization, team size, educational background, professional experience, and expertise in their creative field and in digital CST use.

Interview questions were drawn from a semi-structured interview guide, available in the supplementary materials. To ground the discussion, we asked participants to recall the last CST they adopted and the most interesting recent tool adoption. Follow-up questions included: How did you find out about this tool? What motivated you to switch? What alternatives did you consider, and why did you choose this tool over others?

3.3 Analysis of Videos and Interview Data

The videos were transcribed using intelligent transcription, which removes pauses and filler words and makes minor grammar adjustments. Analysis included open coding, focused coding, and thematic clustering [16].

The first two authors independently coded 3 randomly chosen videos from the dataset through open coding. They discussed the emerging themes and agreed upon a common vocabulary. Once similar codes and themes were identified across many videos with few discrepancies, the two coders finalized the coding scheme and shifted to a focused coding approach, independently coding another 3 randomly chosen videos from the dataset.

To ensure inter-rater reliability [62], we compared the independent coders' results from the focused coding. Agreement levels ranged from 83.56% to 94.64%, translating to Cohen's kappa scores of 0.58 to 0.71 across all categories. Given the moderate to high agreement, one of the coders independently coded the remaining YouTube video data based on the agreed coding scheme. The first author also coded the interview data under this scheme. The two coding authors discussed the data after each interview and identified one new theme from the interviews: maintainability.
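For readers unfamiliar with the agreement statistics above, the following minimal sketch shows how percent agreement and Cohen's kappa can be computed for two coders; the codes and labels are hypothetical, not the study's data.

```python
from collections import Counter

def percent_agreement(a, b):
    """Fraction of segments on which both coders chose the same code."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(a)
    p_o = percent_agreement(a, b)
    count_a, count_b = Counter(a), Counter(b)
    # Chance agreement from each coder's marginal label distribution.
    p_e = sum(count_a[k] * count_b[k] for k in set(a) | set(b)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes assigned by two coders to the same ten video segments.
coder1 = ["performance", "features", "cost", "features", "ui",
          "performance", "features", "ui", "cost", "performance"]
coder2 = ["performance", "features", "cost", "ui", "ui",
          "performance", "features", "ui", "cost", "features"]

print(f"agreement = {percent_agreement(coder1, coder2):.2%}, "
      f"kappa = {cohens_kappa(coder1, coder2):.2f}")  # 80.00%, 0.73
```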

We measured: (1) coverage – the number of videos and interview participants who mentioned a code; and (2) frequency – the number of times a code was mentioned across the data. Figure 2 shows an overview of mentions and coverage of the primary value categories.
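A minimal sketch of these two measures, using hypothetical coded segments (one record per mention, keyed by the video/participant ID it came from):

```python
from collections import defaultdict

# Hypothetical (source, code) pairs -- one entry per coded mention.
coded_segments = [
    ("V01", "performance"), ("V01", "performance"), ("V01", "features"),
    ("V02", "features"), ("P01", "performance"), ("P02", "cost"),
]

frequency = defaultdict(int)  # total mentions per code (frequency)
sources = defaultdict(set)    # distinct videos/participants per code (coverage)
for source, code in coded_segments:
    frequency[code] += 1
    sources[code].add(source)

for code in sorted(frequency, key=frequency.get, reverse=True):
    print(f"{code}: {frequency[code]} mentions, coverage {len(sources[code])}")
```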

3.4 Survey

To further verify our observations, we surveyed 105 creative practitioners to rate and rank each framework value.

3.4.1 Participants. We recruited 105 creative practitioners online via Twitter, Reddit (e.g., r/design, r/userexperience, r/cad), a software company's Slack channel, and a university. Participants were screened by email. We also reached out to the 13 interview participants and relevant personal connections. Compensation was $5 USD or equivalent (participants came from 8 countries and created content for a diverse set of audiences across cultures and languages).

Participants (52 female, 50 male, and 2 non-binary) were aged 19 to 51 (M = 28.26, SD = 5.16). Self-reported expertise was: 8 novice, 27 intermediate, 41 proficient, and 25 expert. Average time working in a creative industry was 4.48 years (SD = 3.60); average time working with digital CSTs was 9.08 years (SD = 7.63).

3.4.2 Questionnaire. In addition to demographics, participants rated how much they valued each of the codes and framework categories on a scale of 1-5 (1 = "none at all", 2 = "a little", 3 = "a moderate amount", 4 = "a lot", 5 = "a great deal"). Participants also ranked the seven main categories against each other in an ordered list.
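The per-value rating statistics reported in Section 4 (mean, SD, median) can be derived from such 1-5 responses with a few lines; the sketch below uses hypothetical ratings, not the survey data.

```python
from statistics import mean, median, stdev

# Hypothetical 1-5 ratings for two subcategories from a handful of respondents.
ratings = {
    "essential features": [5, 4, 5, 3, 5, 4, 5],
    "customizability":    [3, 2, 4, 3, 2, 3, 4],
}

for value, scores in ratings.items():
    print(f"{value}: M={mean(scores):.2f}, SD={stdev(scores):.2f}, "
          f"Median={median(scores)}")
```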

4 RESULTS

Figure 2: Overview of creative practitioners' value categories. Figure shows mentions, coverage and survey rankings (1: top rated to 7: lowest rating). Categories are sorted by overall rank. Our survey placed features/functionality, integration with current workflow, and performance as the top 3, while support, financial cost, and emotional attachment ranked as the bottom 3.

This section describes our framework of creative practitioners' values for CST adoption. The framework's categories and subcategories were derived from the themes identified in the analysis of 23 videos (V01-V23), 13 interviews (P01-P13) and 105 survey responses. Figure 2 provides an overview of the 7 categories of our framework, showing aggregate mentions and coverage, followed by the survey rankings of the categories. This section is organized in the order of the overall rankings. For each category, we summarize its values in a figure (e.g., Figure 3) depicting subcategory mentions, coverage, and survey ratings. Average survey ratings determine the order in which subcategories are presented in each subsection. This section is restricted to results; broader reflections and ties to the literature take place in the discussion section [§5].

Zooming into these value categories, the highest-rated values were a CST's reliable performance [§4.3.1] and ease of use [§4.4.1]. On the other hand, a CST's ability to integrate across non-digital and digital media [§4.2.4], customizability [§4.4.7], and customer support [§4.5.3] were mentioned but not valued as much as the other subcategories (see Figure 1 for overview rankings and definitions).

4.1 Tools’ Features and Functionality

A tool's feature is a command or abstraction that achieves a particular goal. This includes atomic commands such as undo and save, as well as interactive features such as drawing in a sketching application. This was a frequently mentioned category in videos and interviews, by both mentions and coverage. Participants ranked a CST's features as the highest value (Figure 3).

Figure 3: Features and Functionality values. Figure shows mentions, coverage and survey ratings (where 1: no value at all, 5: value a great deal). Values are sorted by survey ratings. Survey shows essential features [§4.1.1] were the most valued, while generalizability [§4.1.5] was the least valued.

4.1.1 Essential Features. The set of features necessary to accomplish a particular creative task aligned with the CST. This includes, for instance, typing words in a word processing tool. Which features are deemed essential depends on the practitioner, tool, and domain. To determine whether a feature is essential, the question is: "if this feature is removed, can a practitioner still accomplish their most common goals?" Practitioners valued tools with essential features over complex CSTs loaded with more specialized, less essential features. Essential features are the target for novices starting out in a new creative domain's tool. V07 described Affinity Photo as having essential editing features: "Some people require the vast amounts of photo editing capabilities that LightRoom and Photoshop have available. I don't need all the bells and whistles". While impressions are subjective, "essential" implies a set of features that is enough to accomplish most tasks: "iMovie is way too basic... Da Vinci Resolve was a nice in-between where it was just complex enough for me to make what I wanted to make" (P03). Survey respondents rated Essential Features an average of 4.33 (SD=0.87, Median=5).

4.1.2 Dynamic Responsiveness and Liveness. The ability to see feedback and effects on an object of interest as a feature is being used. Practitioners manipulate virtual objects on a regular basis, and changes are eventually reflected on their output. For example, moving a rectangle in a vector application with the mouse is often reflected live, while rendering a three-dimensional scene might take time to show the results. This feature facilitates fluid creative expression. As P03 describes, ”What makes Unity superior... it has an actual user interface that you can click around and adjust options. Whereas JavaScript that's like change the value from 60 to 50. Change windows. See what happens. You just have to play with numbers and sometimes that is not the most intuitive.” Survey respondents rated this as 4.26 on average (SD=0.94, Median=5).

4.1.3 Collaboration - Awareness, Feedback, Hand-off. The ability to work with others, including awareness of collaborators, feedback and communication, and hand-off to other stakeholders. V05 mentions awareness of collaborators as a key value: "you'll see the avatars for each person inside the file, you can also see their cursors moving around". With respect to feedback and communication, V09 values Figma's collaboration features as they allow the "ability to jump into the design file itself... the mood board itself, and again add comments... those comments are captured in a place where actually they become actionable items". Furthermore, V04 makes a case for better hand-off features: "you've got your architects, you've got your structural engineers... you're always working with a bunch of different people. Revit allows everybody to work inside of a same file, so this again eliminates chance for user error, and also eliminates a chance for clashes." Survey respondents rated this value 3.80 on average (SD=0.96, Median=4).

4.1.4 Specialization. The ability to do unique, specialized creative tasks using features with high precision and control. In contrast to Essential Features, Specialization can include non-essential features: a function such as content-aware fill in Photoshop would be considered specialization, whereas adjusting the lighting of a photo would be an essential feature. V01 states: "DaVinci Resolve is a great app for the color grading features". In fact, P02 described mixing DaVinci Resolve into their workflow with Adobe Premiere Pro exclusively to adjust the colour and tone of their videos, despite Premiere Pro having colour adjustment capabilities. P06 mentions how "3DS Max does rendering better than any other software tool, so I will use that for just the rendering phase". This was rated 3.65 on average in the survey (SD=1.04, Median=4).

4.1.5 Generalizability. The general- or multi-purpose nature of a CST, where it can be used for various creative tasks and domains. P08 illustrates how this led to choosing Figma over Tableau: "Tableau is very specific to data visualization. And it's very useful in a design setting. It's really useful at the beginning... but it suffers a little bit when... you're trying to polish a prototype. Since not all of our projects are data visualization, we needed a more general-purpose tool. Therefore, we chose Figma where we can use it for more than just InfoVis design". Similarly, P06 shared: "we use 3D Studio Max... it's like a Swiss army knife and can read lots of different forms of data, probably more so than any of our other software." Survey respondents valued general-purpose tools at an average of 3.56 on the five-point scale (SD=1.12, Median=4).

4.2 Integration with Existing Workflow

How well different elements work together or co-exist in an ecology of tools and devices. All interview participants and videos mentioned valuing tools that fit into their creative workflow (Figure 4). Survey participants on average ranked this category second out of the seven primary categories.

Figure 4: Integration with current workflow values. Figure shows mentions, coverage and survey ratings (where 1: no value at all, 5: value a great deal). Values are sorted by survey ratings. Our survey shows practitioners valued integration across tools the most [§4.2.1] and across analog and digital media the least [§4.2.4].

4.2.1 Integration Across Tools. How well the tool interconnects with other tools, either by combining functions from other tools into this one, or through plugins, exporting and importing features, etc. For instance, P06 mentions abandoning a tool because of problems with exporting and interchanging formats: "I hate when anyone gives me data from SketchUp. Like even if they translate it to another piece of another format that I can read in my tool, it will come in very unstructured and requires a lot of rework". V12 gives another example: "the main feature though that I really think sets Premiere Pro apart in this category is dynamic link. This means I can seamlessly switch between Premiere Pro and After Effects and have all of my changes perfectly reflected."

4.2.2 Integration Across Devices. How well the tool supports creative work done across other devices used in creative workflow. Many practitioners talked about working across multiple devices, such as mobile devices, cameras, and computers. V13 mentions this was the major reason for adopting a tool, because ”you can use [Figma] whether you're on a Mac or a PC. So, for all those people who keep asking me if there's a Sketch alternative for PC, this is now my answer”. Poor device integration can be cumbersome and push people to abandon CSTs. P10 describes how they ”use different pens on different devices across Apple, newer Microsoft versions, and Android versions, and they are usually incompatible across each other. This doesn't really work with me”. Similarly, P09 describes how a mobile-only environment optimizes for working with social media: ”Even though I was taught to use the Adobe apps in school, I use the apps that are available on my iPhone... apps like Mojo,... [Adobe] Spark, because it's easier to create graphics. So I don't need to go open a program on my computer and import all the files, export then upload again to my phone. I save time when I do everything on my phone”. The survey rated this value 3.98 on average (SD=1.01, Median=4).

4.2.3 Integration Across Creative Stages. How well the tool supports different stages of a creative project such as ideation or prototyping. In some cases, this overlaps with tool integration, as import-export functionality enables easier movement across stages. P05 describes, ”You can gather feedback in there. You can do brainstorms, and all the files are inside of Figma. So it's really easy to apply whatever you're looking for within the app itself. You're able to prototype in Figma. And there's even new features coming out that let you prototype components and do developer hand-offs. And that was the biggest pull for us to switch over as a team”. Survey participants rated this value an average of 3.74 (SD=0.96, Median=4).

4.2.4 Integration Across Analog and Digital Media. How well the CST supports smooth transfer between digital and non-digital media. Creative practitioners work across both digital and analog tools such as paper, whiteboards, and pens. P10 talks about their workflow while sketching, ”Sometimes I have paper sketches that on my drawing analog tools, on my sketchbooks, that I want to digitize. I use the different versions of the Adobe Lens where you can capture them and then it converts them into a vector drawing”. Overall, survey participants rated valuing this only a moderate amount (M=2.82, SD=1.23, Median=3).

4.3 Tools’ Performance

Figure 5: Performance values. Figure shows mentions, coverage and survey ratings (where 1: no value at all, 5: value a great deal). Values are sorted by survey ratings. Our survey shows practitioners valued reliability the most [§4.3.1], and storage performance the least [§4.3.5].

This category refers to the level of consistency in execution, the processing speed and storage required to produce artifacts, the quality of outputs, and the effort required to maintain projects. 12 interviewees and 16 videos mentioned this 147 times (Figure 5). In the survey, performance ranked third out of the seven major categories.

4.3.1 Reliability. Consistency in performance, such as applications behaving as expected and not crashing. Reliability was rated as the most valued quality across all primary and secondary value categories. P03 talks about switching tools even though "the workflow would be the exact same. I just think that the changes come in terms of quality of life and not having the software crash on me all the time." P07, a creative coder, faced similar issues: "another deal breaker is if a tool glitches out often or is just annoying to work with, and it frequently crashes on me, I lose work and everything takes twice as long, just because the thing is unstable, then I would also definitely avoid it." Survey respondents on average rated this 4.67 (SD=0.70, Median=5).

4.3.2 Quality of Outputs. The quality, accuracy and excellence of the finished creative artifacts. When discussing LaTeX vs Markdown, V18 stated "the cool thing about LaTeX is that it looks very very professional." Similarly, talking about 3D modelling, V17 mentioned "3DS Max excels in animation and also very high quality and good renders and that's why I would choose it". While it was not mentioned as frequently as other codes in this category across videos and participants, survey respondents rated this highly (M=4.13, SD=0.91, Median=4).

4.3.3 Maintainability. Ease with which creative projects can be maintained on this tool over a long time period. For example, P07 describes, ”So, one thing that I usually check is the maturity of the tool... I don't want to be maintaining the infrastructure myself. Doing all the system updates, etc. on your own time because the company is not paying you for this extra work”. Similarly, P12 mentions the difficulty of maintaining software libraries over time, stating ”you've got to kind of think about versioning and there's breaking changes in every major release”. Survey respondents valued maintainability reasonably high (M=3.98, SD=0.97, Median=4).

4.3.4 Processing Speed and Algorithm Sophistication. The time taken, and the ability to leverage resources, for the tool to process and complete a task. Examples include preview and rendering time in the context of video, as highlighted by V21: "I was using Resolve more and... you can easily feel the gain in performance, when you load clips or when you scrub through your footage, or your audio. I also measure the rendering time on each software... Resolve is just a little faster". Survey participants rated this on average 3.88 out of 5 (Median=4, SD=1.01).

4.3.5 Storage. The amount of storage space required to run the tool, either locally or on the cloud. V14 mentions how storage plays a role when installing the software: "the install package was only around 300MB, which is considerably smaller than AutoCAD". V09 reflected on concerns about cloud-only storage: "I couldn't have files installed in my computer and work from locally, it really gave me a lot of anxiety". On the other hand, V20 considers cloud storage a positive: "if I lost a hard drive or if my hard drive is broken at least my design files are safe". While this was mentioned 25 times across 6 interviews and 7 participants, a software bug in the survey collection prevented collecting ratings of how valuable storage was compared to the other performance values (Figure 5).

4.4 User Interface and Experience

Figure 6: User Interface and Experience values. Figure shows mentions, coverage and survey ratings (where 1: no value at all, 5: value a great deal). Values are sorted by survey ratings. Our survey shows practitioners valued ease of use the most [§4.4.1], and customizability the least [§4.4.7].

Components related to how people interact with their CSTs. 13 interviewees and 15 videos mentioned the interface and experience a total of 514 times (Figure 6). Survey participants, on average, ranked this fourth out of seven primary categories when considering the overall impact on adoption.

4.4.1 Ease of Use. The ease with which users can achieve their goals effectively. P13, an architect, talks about how usability factors into CST adoption: "Rhino to me is so intuitive and I value that a lot. Even though I learned Blender and SketchUp in college, I never use them because they were never intuitive to me". Overall, survey respondents rated this as the second-most valuable feature across all secondary categories, with an average of 4.37 (SD=0.75, Median=5).

4.4.2 Interaction Language. The mental model or process required to accomplish a creative goal. P04 illustrates how this plays a role in choosing which CST to adopt: ”a button is a button is a button, no matter where you see it. And because of the nature of this particular UI [referring to their design], it had a lot of common elements that got repeated over and over again. And illustrator was awful. It was like painting with a sledgehammer. We would make a change somewhere and then we'd have to find the 500 other locations where that particular element was used and make that change. so it was, it was very much an uphill battle. At one point we decided to change the font and it was not fun. Even slight color tweaks were a nightmare.” Survey participants highly valued it, rating it a 4.15 on average (SD=0.78, Median=4).

4.4.3 Ease of Experimentation and Startup. The ability to quickly get started, achieve results and generate variations. CSTs have different scaffolds and resources to reduce the time and effort needed to try out new ideas, methods and prototypes or to start a new project. Starting from a blank canvas can be overwhelming; to reduce this, some CSTs provide walk-through tutorials, templates, examples, etc. to help get started with a project and try out the tool. P04 talks about the startup costs: "not having to go through a million steps to get the tool up and running is definitely a deal maker". P03 also talks about the ability to experiment: "Seeing it all next to each other allows me to play around, trial and error and spin up a bunch of characters really quickly". P05, a graphic designer, talks about how startup costs affect how their team selects CSTs: "We like to describe it as how heavy the tool is. It's like Premiere Pro, how long does it take to boot up, get everything going. And how quick can you wound up though your load, your files, and then go through the edits that you're making. There's certain tools, like let's say Photoshop, that's really slow and clunky. And a lot of times we'll ditch it and do things like banner ads in Figma, just because it's so light weight". Survey respondents valued this on average 4.11 out of 5 (SD=0.80, Median=4).

4.4.4 Learning Curve. The time taken to become proficient at using a CST skillfully. P03 says, "I was looking at Adobe Illustrator too, and I just kind of figured that the learning curve for something like that was a bit too high for what I want to pursue. So I went with Sketch since it was a little bit more simple, cause I wanted to focus on minimalist designs". On average, survey participants rated learning curve at 3.98 (SD=0.98, Median=4).

4.4.5 Aesthetics + Organization. The visual embellishment, layout and design of the tool, including color, animation, imagery, and iconography. Aesthetic UI elements can create an impression of what the tool feels like (e.g., "feeling modern", "outdated", or "fun"). Moreover, the general layout can make a tool feel more or less "overwhelming". Illustrating the importance of aesthetics, P02 says, "The layout and colours and design of the software itself, not the work, makes me use it. In a normal week, I stay 8 hours for 5 days in front of that software. I don't want to see ugly colors and rectangles. I don't want to feel like I'm working in a 1960s factory". P08 also brought up the role of aesthetics, suggesting that UX tools are bound to look "more modern given that they are newer" and thus aesthetic qualities can be easily overlooked. P03 echoes similar values: "The interface seemed really clean. I don't know, people look at the Photoshop or I guess Adobe Illustrators' interface and there's like so much stuff everywhere. It can be really overwhelming to look at, but Sketch had a very light interface that was minimalistically designed, it was pretty intuitive, get the grasp of, and I wanted to do more graphic design things and have fun." Survey participants rated it an average of 3.55 (SD=1.13, Median=4).

4.4.6 Similarity of UI to Other Tools. The similarity of the interface and/or user experience to tools currently used or used in the past. Part of this may draw from consistency across tools in the same suite of applications, or from transfer across different software with overlapping functionality. P06 acknowledges: "It's just knowing that if I pick a tool to do this, it's similar to the tool in another piece of software, by the same company that I picked to do the same thing and they're going to behave the same way". P12 also talks about this: "I've adopted P5.JS for creative coding. So that's the web-based version of processing. It has a very similar syntax... it's based on Java script and Java, which makes it nice". Survey participants rated this an average of 3.49 (SD=1.01, Median=4).

4.4.7 Customizability. The extent to which the interface and functionality can be modified. For example, P11, an architect, shared how "changing the interface in AutoCAD to dark mode and organize the toolbars" made it feel easier to use, while stressing that every architect has a completely different personalized interface for AutoCAD. On the other hand, P13, another architect, mentioned designing a plugin that modified the functionality: "I've designed a plugin to puncture the building with different types of windows. This allows me to express myself more creatively". Survey participants rated customizability the lowest value within User Interface and Experience (M=2.98, SD=1.16, Median=3).

4.5 Level of Support

Figure 7: Level of support values. Figure shows mentions, coverage and survey ratings (where 1: no value at all, 5: value a great deal). Values are sorted by survey ratings. Our survey shows practitioners valued tutorials and documentation the most [§4.5.1], and customer support the least [§4.5.3].

The availability of resources that can provide assistance in navigating a tool, such as tutorials, communities of users, and customer support. 9 out of 23 videos and 11 out of 13 interviews discussed how the availability of resources for learning the tool affects their decision-making process – specifically tutorials, the community of other creative practitioners using the tool, and customer support from the tool's developers (Figure 7). Survey participants, on average, ranked this category fifth out of the seven primary categories.

4.5.1 Tutorials and Documentation. The availability of online software learning resources such as video and blog tutorials, and developer documentation. P03 reflects on visual design for games: ”the main challenge was that for something like Sketch at the time, there weren't as many resources or tutorials compared to something like Photoshop or Illustrator. This lack of resources... was kind of an issue and that's why I didn't choose it”. Survey participants rated this an average of 3.94 (SD=0.95, Median=4).

4.5.2 Community of Users and Developers. The availability of support from online and offline communities, including friends, collaborators, and online forums. P04 reflects on their team's decision-making process: "we looked into whether there was an active community of users, not so much because we wanted to be involved in the community or anything like that. But if other people cared about [the tool], that's a good sign to us that, there's a reason to care about it and that there will be help when problem-solving later." V07 explains, "I don't feel like they listen to the community quite as much as say, Affinity, or some other programs out there. The company that makes Procreate, they're really great about listening to their community and implement changes." V17 also shapes tool decisions based on community: "one advantage of SolidWorks is that it does have a larger user community, and so when you go and want to look for learning resources, templates, plugins, etc. it's much easier to find those for SolidWorks." Survey participants rated this an average of 3.92 (SD=0.10, Median=4).

4.5.3 Customer Support. The availability of support from the CST's developers (e.g., developer representatives, live chat). P06 discusses their decision to use a rendering software: "[the tool] has a fighter pilot interface, right, like a lot of tools. I would have a very hard time adopting it, if I'm being quite honest, if we didn't have the guy who wrote the software, working with us to get all the infrastructure configured because that's a whole another game". Survey participants rated this an average of 3.11 (SD=1.30, Median=3).

4.6 Financial Costs

The monetary costs of using the tool individually or with collaborators, a subscription- versus perpetual-license-based business model, or buying one tool versus a bundle. V05 talks about how "many people are leaving the Adobe subscription just to get a finished software because it's a one-time purchase instead of subscribing to a platform of other tools that they might never use". V14 talks about the value of a tool that brings collaborators and other stakeholders into the same design file, like Figma: "I think that's really cool and worth the twelve dollars, you can send out a link to anybody for free since it's web-based. so there's no need to pay for any sort of seats like in other prototyping tools." Financial values were discussed 34 times across 16 videos and 35 times across 12 interviews (Figure 8). This category ranks fifth by frequency of mentions and third by coverage across the primary categories. Survey participants, on average, ranked it second-last out of the seven major categories.

Figure 8: Financial costs of CSTs [§4.6]. Figure shows mentions, coverage and survey ratings (where 1: no value at all, 5: value a great deal).

4.7 Emotional Connection

Feeling a sense of happiness, identity and belonging, ethical responsibility, etc. when using the tool (Figure 9). For example, P10 mentions a sketching tool that ”really brings a smile on your face every time you use it... I feel really happy and at home using this”. P04 and P13 talk about feeling a sense of ethical responsibility when choosing a tool. P04 said, ”this company already owns 90 percent of the market share and is increasingly dictating the industry standards and pushing for all sorts of proprietary stuff. I figured they didn't need to control any more of it. So I'll take my particular, tiny little chunk of business and go elsewhere”. P13 echoes similar concerns, stating: ”I feel really nervous doing an entire project in only one company's umbrella of applications. What if they suddenly make changes that makes it really hard to recover the work”. Survey participants, on average, ranked this last out of the main categories (Figure 2).

Figure 9: Emotional connection with CSTs [§4.7]. Figure shows mentions, coverage and survey ratings (where 1: no value at all, 5: value a great deal).

4.8 Exploration and Discovery of Creativity Support Tools

While the previous categories refer to values considered when choosing to adopt CSTs, we also wanted to address RQ2: how practitioners discover new CSTs and what influences their exploration process. Some creative practitioners explore tools because they are intrinsically motivated to keep learning, or are extrinsically motivated by industry trends and role changes. Often, people retain tools because they are the industry standard (e.g., AutoCAD in architecture). Experience with a tool often acts as inertia that might keep people from switching (Figure 10). Since the survey was primarily meant to verify and rank practitioners' values when choosing to adopt CSTs, not how they discovered their CSTs, this was not included as a survey question.

4.8.1 Personal Recommendations. Discovering CSTs through personal recommendations from friends, collaborators and social connections. P12 describes how their social circle alerts them to new tools: "Each of my conversations with students, collaborators, friends, is almost like a radar". P03 also describes how social interactions lead to new discoveries: "I was at a hackathon and I saw someone creating a poster, with that tool. I thought it was really cool how fast his workflow was. The interface seemed really clean too".

Figure 10: How practitioners discover and explore CSTs. Figure shows mentions and coverage in interviews and videos. Values are sorted by mentions. Our data shows practitioners' explorations were most influenced by personal recommendations [§4.8.1] and least by industry trends [§4.8.6].

4.8.2 Role Change. Feeling a need to adopt new CSTs to adapt to changes in one's role, organization or situation. These include role changes such as students becoming industry professionals, individuals changing jobs, or teams shifting to remote work. P09 talked about their transition from student to industry: "at university we had training in Adobe Creative Cloud, so Photoshop, Premiere and, Illustrator. So I had experience working with those software for user experience... The tools I used changed because I work with social media and all that new media now, like TikTok, Instagram, and Twitter. So sometimes I don't use desktop software at all and just use phone apps." Some participants' tool use was restricted by organizational requirements. P01 mentioned that they "work for the government, so I think the regulations are pretty strict here. I'm not allowed to install any tools on my laptop by myself. And, actually if I want to get a new tool, which I try to, I have to fill out a form, send it to someone and they will decide if I get the tool or if there's an equivalent that is considered secure that they will give me". This motivated P01 to use web applications that did not require installation.

4.8.3 Search and Social Media. Discovery of CSTs by searching the web or getting recommendations from people on social media, forums, or blogs. P04 mentioned ”[finding] OnShape on one of like the 3d printing forums... Blender's huge from an online presence perspective... lots of people talking about it all the time”.

4.8.4 Industry Standards. Exploration is influenced by CSTs that are standard practice in a creative industry. P02 gives an example of a standard-practice CST: "Most interesting tool I've adopted is DaVinci Resolve... because it has been used in Hollywood and the entire film industry around the world for the last 20 years as the primary color grading software". V03 mentions that "BIM software like Revit, Vectorworks and ArchiCAD are really the industry standard… if you want to work on structures that are larger than homes you'll need to learn BIM to secure a job at a large firm". While most practitioners talk about industry standards as a motivating factor for adopting new tools, V16 reflects on how industry standards can make it harder to adopt new tools at the organizational level: "Sketch is still the industry standard and, so to respect our clients we just need to maintain that as our design tool of choice for now."

4.8.5 Experience. Exploration is influenced by psychological inertia, a tendency to maintain the status quo and avoid change for the sake of comfort. P04 reflected on how their team had to assess adopting new CSTs “knowing that we already had an entire workflow that worked well and thousands of hours of experience in Adobe illustrator. Like, yeah, I'm not going to abandon my entire illustration workflow. I have thousands of hours in Adobe illustrator. It's a pretty big deal for me to switch... But I had to make an informed decision. I think it took us like two work days to decide that we are going to reinvent our entire workflow. We basically rebuilt everything we had done for that project up until then in a matter of a few hours, and that was enough to convince us that, yes, this [CST] is the future.”

4.8.6 Industry Trends. Exploration is influenced by trends in the creative industry, such as what other creative practitioners are using and recent technological advancements. P06 shared: “none of us want to be dinosaurs, so we try to stay as fresh and relevant”. Similarly, P12 believes they “tend to gravitate towards things that are new and exciting, and things that are trending in industry, because those things, there's a reason why they're trending in industry... there's a reason why a lot of these different libraries and frameworks are so popular.”

5 DISCUSSION: TIES BETWEEN FRAMEWORK AND LITERATURE

Figure 11: Creative Practitioners’ values and how they fit within existing literature across systems and creativity support tools research, and technology acceptance and adoption theories. Grayed out boxes show there is no corresponding mapping.

The systematic qualitative analysis of practitioners reflecting on their values across the video, interview, and survey responses comes together as a framework of practitioner values and rankings when adopting CSTs (Figure 2). As we discuss our findings (the practitioners’ values and their ratings of how important each value is), we draw connections to design principles and evaluation heuristics in existing foundational literature (Figure 11). The table in Figure 11 includes foundational papers as they tie to our framework (papers with over 100 citations that overlap with two or more values in our framework). Papers overlapping a single cell are discussed in-line throughout this section, where core terms are drawn from our definitions in [§2]. These values and observations prompt further reflection for the wider community of systems creators, researchers, marketers, and educators on how creative practitioners relate to their tools.
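The paper reports aggregate value rankings from the survey (Figure 2) without detailing the aggregation step here. As a purely illustrative sketch, one common way to combine per-respondent rankings is by mean rank, where a lower mean rank indicates a more important value; the value names and data below are hypothetical, not the study’s:

```python
# Illustrative only: aggregating per-respondent value rankings by mean rank.
# The value names and rankings below are hypothetical, not the study's data.
from statistics import mean

rankings = [  # each dict maps a value to one respondent's rank (1 = most important)
    {"features": 1, "integration": 2, "cost": 3},
    {"features": 1, "integration": 3, "cost": 2},
    {"features": 2, "integration": 1, "cost": 3},
]

mean_rank = {value: mean(r[value] for r in rankings) for value in rankings[0]}
for value, rank in sorted(mean_rank.items(), key=lambda kv: kv[1]):
    print(f"{value}: mean rank {rank:.2f}")  # features 1.33, integration 2.00, cost 2.67
```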

5.1 Features/Functionality

The tool's features were ranked the most important consideration in the survey, and were an integral part of practitioners’ decision-making process. This prominence was not surprising, as a CST's features propel individuals to create their content and shape their workflows. Systems and CST research focus largely on the types of features/functionality CSTs should have. Generalizability was the most covered value, with papers referring to it as “High Ceiling” [46, 59, 70], “Wide Walls” [59, 70], or “Generality” [51]. CST research also emphasizes collaboration [7, 59, 70, 78]. The ability to see real-time, dynamic updates to their designs was important to practitioners. Prior literature suggests that this feature facilitates more fluid interaction and is tied to “expressive match” [51] and “observability” [60].

Creative practitioners also prefer CSTs that have a unique design specialization and minimal, essential features. While most CSTs in HCI research are low-complexity tools that contain one or two features to accomplish one or two specific tasks [27], CST products are often complex, feature-packed systems (e.g., [1, 2]). Future research should further explore the relation between creative practitioners’ feature preferences and CST adoption.

5.2 Integration with Current Workflow

During the course of a creative project, a practitioner often works across tools, devices, creative stages, and analog and digital media. Evaluating how well a CST fits into their existing ecosystem and creative practice was the second most valued category. Prior literature discusses integration with other tools through exportability, combined functionality, plugins, etc., using terms such as “simplifying interconnection” [51], “support open interchange” [59], and “compatibility” [60]. Cross-device integration [12] and ubiquitous computing [80] are sub-fields of HCI in their own right, and many CSTs aim to support them [5, 13, 50]. Most CSTs in HCI research are built to support specific creative stages, with idea generation being the most commonly supported creative process [27, 28]. Surprisingly, only a few papers explore how systems might work across different stages (e.g., [39, 67]), which we believe should be considered an evaluation metric in its own right. In contrast, the CST industry is creating tools that span multiple stages (e.g., Figma covers brainstorming and prototyping; DaVinci Resolve covers color grading, editing, and VFX) [75]. With the rapid shift to remote work, there has been an increased shift towards digital CSTs and workflows [75]. That said, practitioners continue to work with both analog and digital media [27]. Further work is needed to explore varying levels of integration. For instance, should individual tools merge into one large system that supports all integration, as done by, say, Affinity Publisher incorporating photo and vector editing, or should tools remain lightweight with seamless import and export across them?

5.3 Performance

Based on how CSTs are marketed and the focus of theoretical models of tool adoption, we expected performance to be a key value for creative practitioners. However, we did not anticipate the many ways in which practitioners assess performance. Reflecting on the results, maintainability was mentioned by interviewees but not reflected in the videos, perhaps because videos aim to introduce tools to viewers rather than discuss long-term projects and their impact on team members and stakeholders. Many of these practices are largely left to individuals to self-organize: naming layers, commenting code, or managing files.

Rogers’ diffusion of innovations theory [60] refers to these as “relative advantage”. On the other hand, despite systems research valuing performance [38], performance is rarely treated as a design heuristic. This may be because performance is largely tied to implementation rather than concepts, often falling beyond the scope of many research projects. With progress and democratization in areas like cloud computing and computer graphics, these performance aspects will continue to evolve. Developers and researchers can use performance expectations to innovate in a more human-centered manner (e.g., via feedforward). These values can also be used by educators when choosing tools to teach, and by businesses to differentiate their products.

5.4 User Interface and Experience

Current practice in HCI sometimes advocates for usability evaluation as a key part of every design process, and for good reason: usability evaluation has a significant role to play when conditions warrant it [31, 32, 52, 63]. This tie to usability is reflected by how well “Ease of Use” (row 1 in this category) corresponds with existing literature [3, 8, 47, 51, 59]. However, creative practitioners’ CST adoption criteria go beyond usability to also include interaction language, ease of experimentation and startup, learning curve, aesthetics and layout, UI similarity to other tools, and customizability. Reflecting on our results, customizability was not mentioned in the videos, likely because videos target first-time audiences. Moreover, highly customized software becomes inconsistent across people, which can hinder other aspects such as support.

Within the value framework, user interface and experience might appear similar to features and functionality (§4.1, §5.1). Yet, features and functionality describe commands or abstractions to achieve a creative goal (e.g., liveness and collaboration features), whereas user interface and experience refer to values related to how people interact with CSTs (e.g., ease of use, learning curve, etc.).

Systems and CST research share a focus on interaction language and learning curve. The former is referred to as “path of least resistance”, “predictability” [46], “viscosity/fluidity” [8, 30], and “solution viscosity” [51]; the latter as “low threshold” [46, 59, 70], “hidden dependencies” [8, 30], “role responsiveness” [51], “black boxes of explorability” [59], and “exploratory search” [70]. The high level of overlap is likely because, as Greenberg suggests [31], a tool's general approach sets expectations for problem solving and shapes how practitioners think and work with tools. Aesthetics appeared to be easily overlooked, yet much research suggests it might tie to unconscious processes that shape how people feel about a particular tool [32, 47, 48]. Future research can examine how varying these subcategories affects CST adoption.

5.5 Level of Support

The field of software learning within HCI research aims to understand and scaffold the use of complex CSTs. Surprisingly, support, which often receives large investment from firms, was rated rather low. Past systems have helped bring learning resources into existing tools [11, 25, 33, 45]. While we tie these elements to how tools might be adopted by creative practitioners, further work might consider how to more tightly integrate support and adoption.

5.6 Financial Costs

Most theories of technology acceptance and adoption include monetary cost as a parameter. Yet, practitioners consider factors beyond these theories: subscriptions vs. one-time purchases, bundles, collaboration costs, etc. These considerations may also change over time.

Based on how CSTs are marketed and the focus of theoretical models of tool adoption, we expected monetary costs to be a key value for creative practitioners. When coding the videos and interviews, we hypothesized that the low ranking was due to self-report and social desirability biases. However, even in anonymous survey responses, participants consistently ranked it as the second-least valued category. This might be due to differences in pricing across creative domains (e.g., software development CSTs are usually free, while architecture and 2D vector CSTs are usually paid). Industry standards around pricing may accustom practitioners to certain prices. Investigating how practitioners perceive financial cost beyond monetary value will benefit CST developers, marketers, and companies. In some cases, we saw practitioners who are more than willing to pay for products provided they benefit from their use compared to alternatives. We also found it interesting to see new business models appear featuring usage tiers that mix one-time purchases with smaller subscription feature sets.

5.7 Emotional Connection

Feeling an emotional connection and identifying with a tool was the least valued category in the framework. When coding the videos and interviews, we assumed the low frequency might be due to self-report biases when talking about emotion and the feeling of using a CST [6, 36]. The consistently low rank in survey responses might be because participants ranked values based on how much each influences their ability to accomplish creative goals. Values such as emotional connection and identification with a tool have yet to be explored in depth in the literature. Nouwens and Klokmose [49] begin to explore how knowledge workers form emotional connections to the applications they use. There is also a recent movement to create designs that evoke emotions to drive positive user experiences, whether viscerally, behaviorally, or reflectively [48]. We believe this is under-investigated, and may be similar to how practitioners are drawn to analog tools, pens, and notebooks because of how these tools make them feel. The emotional connection between practitioners and their tools would be interesting to investigate in its own right.

5.8 Exploration and Discovery

The primary mechanism by which practitioners discover tools is personal recommendations. Yet, factors such as role changes, search and social media, industry standards and trends, and experience-driven inertia also affect the exploration process. Some of these have been studied previously. For example, marketing and social science research describes how (1) a customer's inertia or knowledge in a tool can hinder exploration and tool switching [29], (2) blogger and social media recommendations affect product purchase intentions [42], and (3) market trends of products and industries affect CST development [27, 76, 81].

These creative practitioner values illustrate that CSTs are not individually siloed tools [10, 13, 47, 80], but rather part of a much larger, complex ecosystem of people, tools, activities, and sets of technologies.

6 LIMITATIONS AND FUTURE WORK

This study triangulates self-report data from three diverse sources. While YouTube video data lacks richness and detail because of its audience, biases, and format, the semi-structured interviews with creative practitioners provided rich first-hand accounts of their values. Long-form semi-structured interviews do not provide a sense of scale, which merited verifying creative practitioners’ values through the survey. Combining these approaches helps build a deep, rich, qualitative understanding of creative practitioners’ values. However, as with any methodology, there are trade-offs: self-report data may have gaps or inconsistencies with actual observed behavior; responses to a controlled questionnaire may differ from natural exploration behavior in unanticipated ways; valuations were done in a short amount of time and based on our textual descriptions; and CST log analysis can provide local, granular in-situ data but lacks qualitative depth. One of the realities of qualitative coding is that it draws on the authors’ pre-existing knowledge. While the coding was conducted independently by two authors and the inter-rater reliability was strong and significant, future quantitative and qualitative analyses of long-term CST usage, both in the lab and in the wild, will further expand and contextualize these initial results.
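The paper does not state which inter-rater reliability statistic was used; for two coders assigning categorical codes, a common choice is Cohen's kappa. A minimal sketch with hypothetical codes (not the study's data):

```python
# Minimal inter-rater reliability check for two independent coders.
# Hypothetical codes; assumes each segment receives exactly one label per coder.
from sklearn.metrics import cohen_kappa_score

coder_a = ["features", "cost", "support", "features", "ui", "cost"]
coder_b = ["features", "cost", "support", "ui", "ui", "cost"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")  # values above ~0.8 are often read as strong agreement
```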

In an effort to standardize CST evaluation methods and go beyond usability as an evaluation approach [24, 35, 38, 55, 56, 57, 59, 69], HCI researchers have developed a range of quantitative methods such as the Creativity Support Index [14, 17], reflecting the whole breadth of HCI evaluation techniques [27, 70]. Our framework brings the creative practitioner's perspective as a way to look at CSTs for long-term adoption, retention, and abandonment. Creativity research shows that creativity is subjective and based on practitioners’ backgrounds [19, 44, 53]. We suspect other aspects of a practitioner's background may play a role in CST adoption. In our investigation, we collected data from people across 19 different creative professions (seven across interviews, as seen in Figure 12; ten across YouTube videos, as seen in Figure 13; and fifteen across our survey). However, despite collecting background information such as experience/expertise, education, and demographics, this was not a well-balanced, representative sample from which to confidently identify trends in how background affects CST adoption. We believe our framework can help future research as a set of reflective heuristics in an evaluation toolbox (such as [38]). What makes a tool successful or impactful is not a one-size-fits-all question.
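For readers unfamiliar with the Creativity Support Index mentioned above, its published scoring scheme, as we understand it from [14, 17], combines two agreement ratings per factor with weights from 15 paired factor comparisons, normalized to a 0–100 score. A hedged sketch with hypothetical responses:

```python
# Rough sketch of Creativity Support Index (CSI) scoring for one respondent,
# per our reading of Cherry & Latulipe [17]: two 0-10 agreement ratings per
# factor, weighted by paired-comparison counts; CSI = sum(score * count) / 3.
# All numbers below are hypothetical.
ratings = {  # (statement 1, statement 2), each rated 0-10
    "Collaboration": (6, 7), "Enjoyment": (8, 9), "Exploration": (9, 8),
    "Expressiveness": (7, 7), "Immersion": (5, 6), "ResultsWorthEffort": (9, 9),
}
counts = {  # times each factor was picked across the 15 paired comparisons
    "Collaboration": 1, "Enjoyment": 3, "Exploration": 4,
    "Expressiveness": 3, "Immersion": 1, "ResultsWorthEffort": 3,
}
assert sum(counts.values()) == 15  # C(6, 2) pairwise comparisons

csi = sum(sum(ratings[f]) * counts[f] for f in ratings) / 3.0
print(f"CSI: {csi:.1f} / 100")  # 79.7 for these hypothetical responses
```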

We hope that these values and observations prompt further reflection for the wider community of CST systems creators, researchers, marketers, and educators on how practitioners relate to their tools. For example, HCI researchers and CST developers could use this framework to identify innovation gaps and opportunities unaddressed by current CSTs, and to motivate the development of novel CSTs, treating the values almost as dimensions for design spaces or competitor analyses. CST marketers could use this framework to understand customers’ needs and wants and market tools accordingly. Educators can assess CSTs when choosing tools to include in their curriculum, aiming for the best student development. Novice and expert creative practitioners can also use this framework to reflect on their own values.

7 CONCLUSION

The rapidly evolving landscape of diverse Creativity Support Tools makes it imperative for creative practitioners to constantly explore and decide whether to adopt, retain, or abandon CSTs to reach their creative potential. This paper presents a conceptual framework of creative practitioners’ values for the discovery, adoption, retention, and abandonment of CSTs, informed by empirical observations of creative practitioners’ values across 23 YouTube videos, 13 interviews, and 105 survey responses. This uncovers the creative practitioner's perspective, in contrast to existing developer-, educator-, or researcher-centric angles. To encourage reflection from the various CST stakeholders, we further tie creative practitioners’ values to existing design heuristics and principles in systems, CST, and theoretical technology adoption research. This practitioner perspective exposes that values do not revolve around individual siloed systems, but rather around the larger, complex ecosystem of people, their activities, workflows, and sets of technologies at both the tool and device level.

APPENDIX

Figure 12: Overview of participants analyzed.
Figure 13: Overview of videos analyzed.

Semi-Structured Interview Guide

  1. Generally, tell us about who you are and what you do.
  2. When was the last time you adopted a new tool into your creative workflow? Could you walk me through your project and how you used this tool in that project?
  3. How did you find this new tool? Describe your process of exploring the tool landscape for a new tool either using technology such as search engines or asking other people.
  4. Were there any alternative tools you explored? Why did you choose to use this compared to other alternatives? What factors/trade-offs affected your decision to use this application?
  5. Reflecting on your experience exploring alternatives and making this decision, what information helped you make your decision to adopt this new tool? What additional information or analysis do you wish someone had provided you with to have made your exploration and decision more convenient?
  6. Were there any challenges/frustrations when adopting a new tool into your workflow?
  7. Did you use any strategies to overcome these challenges or make the transition to a new tool easier?
  8. Are there other tools you have changed in the past? For example, software suites such as Office, Google Docs, and the Adobe Suite, operating systems, etc. Note: if participants are chosen through judgement sampling we may ask about a specific software change (e.g., “We know that as a developer you have gone through different Javascript frameworks in the past, could you tell us more about some of the decisions behind these changes?”)
  9. Overall, what are the deal makers or deal breakers for you in choosing a tool?

Survey Questions for Creative Practitioners: linked here https://tinyurl.com/CreativePractitionerSurvey

To see the codes each interviewee and video mentioned, refer to the table linked here: https://tinyurl.com/CodingResults

To see the additional figures, refer to the folder linked here: https://tinyurl.com/FigureFolder

REFERENCES

  • [n.d.]. https://helpx.adobe.com/photoshop/using/tools.html
  • 2021. AutoCAD Commands & Shortcuts for Beginners. https://all3dp.com/2/autocad-commands-shortcuts/
  • Paul S Adler and Terry Winograd. 1992. The Usability Challenge.
  • Icek Ajzen. 2011. The theory of planned behaviour: Reactions and reflections.
  • Sriram Karthik Badam and Niklas Elmqvist. 2014. Polychrome: A cross-device framework for collaborative web visualization. In Proceedings of the Ninth ACM International Conference on Interactive Tabletops and Surfaces. 109–118.
  • Lisa Feldman Barrett. 2004. Feelings or words? Understanding the content in self-report ratings of experienced emotion. Journal of Personality and Social Psychology 87, 2 (2004), 266.
  • Pernille Bjørn and Nina Boulus-Rødje. 2015. The multiple intersecting sites of design in CSCW research. Computer Supported Cooperative Work (CSCW) 24, 4 (2015), 319–351.
  • Alan F Blackwell, Carol Britton, A Cox, Thomas RG Green, Corin Gurr, Gada Kadoda, MS Kutar, Martin Loomes, Chrystopher L Nehaniv, Marian Petre, et al. 2001. Cognitive dimensions of notations: Design tools for cognitive technology. In International Conference on Cognitive Technology. Springer, 325–341.
  • Ann Blandford, Dominic Furniss, and Stephann Makri. 2016. Qualitative HCI research: Going behind the scenes. Synthesis lectures on human-centered informatics 9, 1(2016), 1–115.
  • Susanne Bødker and Clemens Nylandsted Klokmose. 2011. The human–artifact model: An activity theoretical approach to artifact ecologies. Human–Computer Interaction 26, 4 (2011), 315–371.
  • Joel Brandt, Mira Dontcheva, Marcos Weskamp, and Scott R Klemmer. 2010. Example-centric programming: integrating web search into the development environment. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 513–522.
  • Frederik Brudy, Christian Holz, Roman Rädle, Chi-Jui Wu, Steven Houben, Clemens Nylandsted Klokmose, and Nicolai Marquardt. 2019. Cross-device taxonomy: Survey, opportunities and challenges of interactions spanning across multiple devices. In Proceedings of the 2019 chi conference on human factors in computing systems. 1–28.
  • Frederik Brudy, David Ledo, Michel Pahud, Nathalie Henry Riche, Christian Holz, Anand Waghmare, Hemant Bhaskar Surale, Marcus Peinado, Xiaokuan Zhang, Shannon Joyner, et al. 2020. SurfaceFleet: Exploring Distributed Interactions Unbounded from Device, Application, User, and Time. In Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology. 7–21.
  • Erin A Carroll and Celine Latulipe. 2009. The creativity support index. In CHI’09 Extended Abstracts on Human Factors in Computing Systems. 4009–4014.
  • Erin A Carroll and Celine Latulipe. 2012. Triangulating the personal creative experience: self-report, external judgments, and physiology. In Proceedings of Graphics Interface 2012. 53–60.
  • Kathy Charmaz. 2014. Constructing grounded theory. sage.
  • Erin Cherry and Celine Latulipe. 2014. Quantifying the creativity support of digital tools through the creativity support index. ACM Transactions on Computer-Human Interaction (TOCHI) 21, 4(2014), 1–25.
  • Hilary Collins. 2018. Creative research: the theory and practice of research for the creative industries. (2018).
  • Nigel Cross. 2004. Expertise in design: an overview. Design studies 25, 5 (2004), 427–441.
  • Ruta Desai, Fraser Anderson, Justin Matejka, Stelian Coros, James McCann, George Fitzmaurice, and Tovi Grossman. 2019. Geppetto: Enabling semantic design of expressive robot behaviors. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. 1–14.
  • Andrew Dillon and Michael G Morris. 1996. User acceptance of information technology: Theories and models. Annual Review of Information Science and Technology (ARIST) 31 (1996), 3–32.
  • Graham Dove and Sara Jones. 2013. Evaluating creativity support in co-design workshops. (2013).
  • Donald P Ely. 1999. Conditions that facilitate the implementation of educational technology innovations. Educational technology 39, 6 (1999), 23–27.
  • James Fogarty. 2017. Code and Contribution in Interactive Systems Research. In Workshop HCITools: Strategies and Best Practices for Designing, Evaluating and Sharing Technical HCI Toolkits at CHI.
  • C Ailie Fraser, Tricia J Ngoon, Mira Dontcheva, and Scott Klemmer. 2019. RePlay: contextually presenting learning videos across software applications. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. 1–13.
  • C Ailie Fraser, Tricia J Ngoon, Ariel S Weingarten, Mira Dontcheva, and Scott Klemmer. 2017. CritiqueKit: A mixed-initiative, real-time interface for improving feedback. In Adjunct Publication of the 30th Annual ACM Symposium on User Interface Software and Technology. 7–9.
  • Jonas Frich, Lindsay MacDonald Vermeulen, Christian Remy, Michael Mose Biskjaer, and Peter Dalsgaard. 2019. Mapping the landscape of creativity support tools in HCI. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. 1–18.
  • Jonas Frich, Midas Nouwens, Kim Halskov, and Peter Dalsgaard. 2021. How Digital Tools Impact Convergent and Divergent Thinking in Design Ideation. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 1–11.
  • David M Gray, Steven D'Alessandro, Lester W Johnson, and Leanne Carter. 2017. Inertia in services: causes and consequences for switching. Journal of Services Marketing (2017).
  • Thomas RG Green. 1989. Cognitive dimensions of notations. People and computers V(1989), 443–460.
  • Saul Greenberg. 2007. Toolkits and interface creativity. Multimedia Tools and Applications 32, 2 (2007), 139–159.
  • Saul Greenberg and Bill Buxton. 2008. Usability evaluation considered harmful (some of the time). In Proceedings of the SIGCHI conference on Human factors in computing systems. 111–120.
  • Philip J Guo. 2015. Codeopticon: Real-time, one-to-many human tutoring for computer programming. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology. 599–608.
  • Dawn Michele Jacobsen. 1998. Adoption patterns and characteristics of faculty who integrate computer technology for teaching and learning in higher education. (1998).
  • Joseph ‘Jofish’ Kaye. 2007. Evaluating experience-focused HCI. In CHI’07 Extended Abstracts on Human Factors in Computing Systems. 1661–1664.
  • John F Kihlstrom, Eric Eich, Deborah Sandbrand, and Betsy A Tobias. 1999. Emotion and memory: Implications for self-report. In The science of self-report. Psychology Press, 93–112.
  • Joy Kim, Mira Dontcheva, Wilmot Li, Michael S Bernstein, and Daniela Steinsapir. 2015. Motif: Supporting novice creativity through expert patterns. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems. 1211–1220.
  • David Ledo, Steven Houben, Jo Vermeulen, Nicolai Marquardt, Lora Oehlberg, and Saul Greenberg. 2018. Evaluation strategies for HCI toolkit research. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. 1–17.
  • David Ledo, Jo Vermeulen, Sheelagh Carpendale, Saul Greenberg, Lora Oehlberg, and Sebastian Boring. 2019. Astral: Prototyping Mobile and Smart Object Interactive Behaviours Using Familiar Applications. In Proceedings of the 2019 on Designing Interactive Systems Conference. 711–724.
  • Younghwa Lee, Kenneth A Kozar, and Kai RT Larsen. 2003. The technology acceptance model: Past, present, and future. Communications of the Association for information systems 12, 1(2003), 50.
  • Yan Li and James R Lindner. 2007. Faculty adoption behaviour about web-based distance education: a case study from China Agricultural University. British Journal of Educational Technology 38, 1 (2007), 83–94.
  • Long-Chuan Lu, Wen-Pin Chang, and Hsiu-Hua Chang. 2014. Consumer attitudes toward blogger's sponsored recommendations and purchase intention: The effect of sponsorship type, product type, and brand awareness. Computers in Human Behavior 34 (2014), 258–266.
  • Nic Lupfer, Andruid Kerne, Andrew M Webb, and Rhema Linder. 2016. Patterns of free-form curation: Visual thinking with web content. In Proceedings of the 24th ACM international conference on Multimedia. 12–21.
  • Brooke N Macnamara and Megha Maitra. 2019. The role of deliberate practice in expert performance: revisiting Ericsson, Krampe & Tesch-Römer (1993). Royal Society open science 6, 8 (2019), 190327.
  • Justin Matejka, Wei Li, Tovi Grossman, and George Fitzmaurice. 2009. CommunityCommands: command recommendations for software applications. In Proceedings of the 22nd annual ACM symposium on User interface software and technology. 193–202.
  • Brad Myers, Scott E Hudson, and Randy Pausch. 2000. Past, present, and future of user interface software tools. ACM Transactions on Computer-Human Interaction (TOCHI) 7, 1(2000), 3–28.
  • Donald A Norman. 2004. Beauty, goodness, and usability. Human-Computer Interaction 19, 4 (2004), 311–318.
  • Donald A Norman. 2004. Emotional design: Why we love (or hate) everyday things. Basic Civitas Books.
  • Midas Nouwens and Clemens Nylandsted Klokmose. 2018. The application and its consequences for non-standard knowledge work. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. 1–12.
  • Sangeun Oh, Hyuck Yoo, Dae R Jeong, Duc Hoang Bui, and Insik Shin. 2017. Mobile plus: Multi-device mobile platform for cross-device functionality sharing. In Proceedings of the 15th Annual International Conference on Mobile Systems, Applications, and Services. 332–344.
  • Dan R Olsen Jr. 2007. Evaluating user interface systems research. In Proceedings of the 20th annual ACM symposium on User interface software and technology. 251–258.
  • Antti Oulasvirta and Kasper Hornbæk. 2016. Hci research as problem-solving. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. 4956–4967.
  • Ozgu Ozkan and Fehmi Dogan. 2013. Cognitive strategies of analogical reasoning in design: Differences between expert and novice designers. Design Studies 34, 2 (2013), 161–192.
  • Srishti Palani, Zijian Ding, Austin Nguyen, Andrew Chuang, Stephen MacNeil, and Steven P Dow. 2021. CoNotate: Suggesting Queries Based on Notes Promotes Knowledge Discovery. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. 1–14.
  • Ingrid Pettersson, Florian Lachner, Anna-Katharina Frison, Andreas Riener, and Andreas Butz. 2018. A Bermuda triangle? A Review of method application and triangulation in user experience evaluation. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. 1–16.
  • Christian Remy, Oliver Bates, Alan Dix, Vanessa Thomas, Mike Hazas, Adrian Friday, and Elaine M Huang. 2018. Evaluation beyond usability: Validating sustainable HCI research. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. 1–14.
  • Christian Remy, Lindsay MacDonald Vermeulen, Jonas Frich, Michael Mose Biskjaer, and Peter Dalsgaard. 2020. Evaluating Creativity Support Tools in HCI Research. In Proceedings of the 2020 ACM Designing Interactive Systems Conference. 457–476.
  • Karen Renaud and Judy Van Biljon. 2008. Predicting technology acceptance and adoption by the elderly: a qualitative study. In Proceedings of the 2008 annual research conference of the South African Institute of Computer Scientists and Information Technologists on IT research in developing countries: riding the wave of technology. 210–219.
  • Mitchel Resnick, Brad Myers, Kumiyo Nakakoji, Ben Shneiderman, Randy Pausch, Ted Selker, and Mike Eisenberg. 2005. Design principles for tools to support creative thinking. (2005).
  • Everett M Rogers. 2010. Diffusion of innovations. Simon and Schuster.
  • Simon Roodhouse. 2006. The creative industries: definitions, quantification and practice. Cultural Industries: The British Experience in International Perspective. Online, Berlin: Humboldt University Berlin, Edoc-Server (2006), 13–32.
  • Frank E Saal, Ronald G Downey, and Mary A Lahey. 1980. Rating the ratings: Assessing the psychometric quality of rating data. Psychological Bulletin 88, 2 (1980), 413.
  • Antti Salovaara, Antti Oulasvirta, and Giulio Jacucci. 2017. Evaluation of prototypes and the problem of possible futures. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. 2064–2077.
  • Vernon T Sarver. 1983. Ajzen and Fishbein's “theory of reasoned action”: A critical assessment. (1983).
  • Arvind Satyanarayan, Bongshin Lee, Donghao Ren, Jeffrey Heer, John Stasko, John Thompson, Matthew Brehmer, and Zhicheng Liu. 2019. Critical reflections on visualization authoring systems. IEEE transactions on visualization and computer graphics 26, 1(2019), 461–471.
  • Peter Shea, Alexandra Pickett, and Chun Sau Li. 2005. Increasing access to higher education: A study of the diffusion of online teaching among 913 college faculty. International Review of Research in Open and Distributed Learning 6, 2(2005), 1–27.
  • Renata M Sheppard, Mahsa Kamali, Raoul Rivas, Morihiko Tamai, Zhenyu Yang, Wanmin Wu, and Klara Nahrstedt. 2008. Advancing interactive collaborative mediums through tele-immersive dance (TED) a symbiotic creativity and design environment for art and computer science. In Proceedings of the 16th ACM international conference on Multimedia. 579–588.
  • Lorraine Sherry. 1998. An integrated technology adoption and diffusion model. International Journal of Educational Telecommunications 4, 2(1998), 113–145.
  • Ben Shneiderman. 1999. User interfaces for creativity support tools. In Proceedings of the 3rd conference on Creativity & cognition. 15–22.
  • Ben Shneiderman. 2007. Creativity support tools: Accelerating discovery and innovation. Commun. ACM 50, 12 (2007), 20–32.
  • Ben Shneiderman. 2009. Creativity support tools: A grand challenge for HCI researchers. In Engineering the user interface. Springer, 1–9.
  • Pao Siangliulue, Joel Chan, Steven P Dow, and Krzysztof Z Gajos. 2016. IdeaHound: improving large-scale collaborative ideation with crowd-powered real-time semantic modeling. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology. 609–624.
  • Stacey H Stockdill and Diane L Morehouse. 1992. Critical factors in the successful adoption of technology: A checklist based on TDC findings. Educational Technology 32, 1 (1992), 57–58.
  • Daniel W Surry and John D Farquhar. 1997. Diffusion theory and instructional technology. Journal of Instructional Science and technology 2, 1 (1997), 24–36.
  • UX Tools. [n.d.]. 2020 Tools Survey Results. https://uxtools.co/survey-2020/
  • Klaus K Urban. 1991. Recent trends in creativity research and theory in Western Europe. European Journal of High Ability 1, 1 (1991), 99–113.
  • Viswanath Venkatesh, James YL Thong, and Xin Xu. 2016. Unified theory of acceptance and use of technology: A synthesis and the road ahead. Journal of the association for Information Systems 17, 5 (2016), 328–376.
  • James R Wallace, Saba Oji, and Craig Anslow. 2017. Technologies, methods, and values: changes in empirical research at CSCW 1990-2015. Proceedings of the ACM on Human-Computer Interaction 1, CSCW(2017), 1–18.
  • Andy Warr and Eamonn O'Neill. 2005. Understanding design as a social creative process. In Proceedings of the 5th Conference on Creativity & Cognition. 118–127.
  • Mark Weiser. 1991. The Computer for the 21st Century. Scientific American 265, 3 (1991), 94–105.
  • Andrzej P Wierzbicki and Yoshiteru Nakamori. 2007. Creative environments: Issues of creativity support for the knowledge civilization age. Vol. 59. Springer.
  • Brent Wilson, Lorraine Sherry, Jackie Dobrovolny, Mike Batty, and Martin Ryder. 2000. Adoption of learning technologies in schools and universities. Handbook on information technologies for education & training. New York: Springer-Verlag (2000).
  • Brent Wilson, Lorraine Sherry, Jackie Dobrovolny, Mike Batty, and Martin Ryder. 2002. Adoption factors and processes. Handbook on information technologies for education and training (2002), 293–307.

FOOTNOTE

1 https://www.youtube.com
