People with little design experience, in particular, seem to be more frustrated during the design process. We think our findings and approach to providing real-time, multifaceted feedback may also be applicable to non-mobile GUI design tasks (e.g., web design). The contributions of this work include 1) a characterization of the problems users face when designing GUIs, based on semi-structured interviews, 2) the design and evaluation of GUIComp, which provides users with real-time, multi-level feedback for mobile GUI design and facilitates an iterative design process, and 3) lessons learned from our experience and design guidelines for GUI prototyping assistance.
GUI Prototyping Tools
Approaches for Assisting GUI Prototyping
Metrics for Measuring Visual Complexity
We use Riegler and Holzmann's metric [35] in this work because it can express visual complexity as numerical values in real time.
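To make the idea of a real-time numerical complexity value concrete, here is a minimal sketch of one such sub-score. This is our own illustrative formulation (the `Element` class and the density definition are assumptions), not the actual metric from [35]:

```python
# Illustrative sketch (not the actual metric from [35]): a visual-complexity
# sub-score that can be recomputed in real time from the elements on a canvas.
from dataclasses import dataclass

@dataclass
class Element:
    x: int
    y: int
    width: int
    height: int

def density_score(elements, canvas_w, canvas_h):
    """Fraction of the canvas covered by element bounding boxes (0..1)."""
    if not elements:
        return 0.0
    covered = sum(e.width * e.height for e in elements)
    return min(covered / (canvas_w * canvas_h), 1.0)

elements = [Element(0, 0, 100, 50), Element(0, 60, 100, 40)]
print(density_score(elements, 200, 300))  # 0.15
```

Because the score is a cheap arithmetic pass over the element list, it can be recomputed on every edit event without noticeable latency.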
Computational Method for Graphic User Interface Design
The CNN can be divided into two parts, one that extracts features from the image and one that classifies it, as shown in Figure 8. In general, an input tensor has the form (number of images) x (image width) x (image height) x (image depth), unlike the example shown in Figure 9. Deconvolution is an algorithm that is widely used in signal processing and image processing.
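The batch-tensor layout and the split between feature extraction and classification can be illustrated with NumPy shapes alone (the batch size and image dimensions below are our own example values, not from the paper):

```python
import numpy as np

# Batch tensor in the (number of images) x (image width) x (image height)
# x (image depth) layout described above: 32 RGB images of size 64 x 64.
batch = np.zeros((32, 64, 64, 3), dtype=np.float32)
print(batch.shape)      # (32, 64, 64, 3)

# After the feature-extraction part of a CNN, the feature maps are
# flattened per image before the classification (fully connected) part.
features = batch.reshape(batch.shape[0], -1)
print(features.shape)   # (32, 12288), since 64 * 64 * 3 = 12288
```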
Similarly, the transposed convolution can be represented, as shown in Figure 12(c), by unrolling the 3 x 3 kernel into a 4 x 16 matrix, flattening the input into a 16 x 1 vector and the output into a 4 x 1 vector, and applying the transpose of that matrix.
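The matrix view above can be verified numerically. The following sketch (our own construction, assuming a 4 x 4 input, stride 1, and no padding so that the output is 2 x 2) builds the 4 x 16 convolution matrix C and shows that its transpose maps the 4 x 1 output back to a 16 x 1 vector:

```python
import numpy as np

# Convolution of a 3x3 kernel over a 4x4 input (stride 1, no padding)
# yields a 2x2 output, so it can be written as a 4x16 matrix C acting
# on the flattened 16x1 input.
kernel = np.arange(1, 10, dtype=float).reshape(3, 3)

C = np.zeros((4, 16))
for out_idx, (oy, ox) in enumerate([(0, 0), (0, 1), (1, 0), (1, 1)]):
    for ky in range(3):
        for kx in range(3):
            # Each output position reads a 3x3 window of the 4x4 input.
            C[out_idx, (oy + ky) * 4 + (ox + kx)] = kernel[ky, kx]

x = np.random.rand(16)   # flattened 4x4 input
y = C @ x                # forward convolution: 4x1 output
x_up = C.T @ y           # transposed convolution: maps 4x1 back to 16x1
print(C.shape, y.shape, x_up.shape)  # (4, 16) (4,) (16,)
```

Note that `C.T @ y` recovers the *shape* of the input, not its values; that is exactly why transposed convolution is used for upsampling in decoder networks.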
Participants
We chose the mobile GUI domain because designing mobile GUIs is a difficult design task due to 1) the innate characteristics of the domain, such as small screens with touch-based input, 2) the increasing popularity of mobile GUIs [1, 44], and 3) the lack of appropriate guidance in existing tools [13], such as UXPin [15] and Proto.io [16].
Procedure
We then introduced the task that the participants had to perform, which was to create a GUI for an online shopping mall application that displays product information along with product images. Note that we did not impose detailed or strict design requirements, such as the color theme or product domain. The task session for each participant lasted approximately one hour, and participants were videotaped during the session.
After completing the task, each participant watched the recorded video together with the researcher and discussed the challenges the participant encountered during the GUI design process.
Identifying Difficulties Encountered by Participants
One participant reported difficulty determining the optimal size of elements (e.g., icons, images, and buttons) for the application users. During the task session, a participant looked for GUI evaluation methods or guidelines online, but the results returned were not helpful. Four participants stated that they had no idea how to determine which areas would be viewed most by users.
Based on our observation, this difficulty seemed to lead to frequent changes in the position and size of the images.
Requirements of a Tool for Assisting GUI Prototyping
One participant said, "It was difficult to guess which part the customer would look for first, as all the text and images are important, I think." Another participant said, "I want to emphasize selling points in my design, but I don't know which part is best to be the point." When we asked the participant what made her think about this emphasis, she replied that she had used an online shopping mall where important information, such as coupons or product prices, was not sufficiently clear to shoppers on a mobile screen. Feedback can be considered one of the most powerful instruments for helping users achieve a desired goal [52], especially in creative work [53]. Feedback given during an early design stage can enable users to iteratively improve the quality of the design [6, 19].
GUIComp is designed as an add-on for mobile GUI design applications, so it can be easily adapted to any other tool with proper configuration of client-server communication.
Tool Overview
We present a web-based companion tool called GUICompanion (GUIComp) that provides instant feedback on GUI designs. In the following sections, we describe how we capture users' design states, parse web elements, and extract design elements (e.g., element types, dimensions, and colormaps), and then present the panels, metrics, and datasets used for GUIComp.
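The extraction step mentioned above can be sketched as a simple pass over the parsed canvas elements. The element dictionaries and field names below are hypothetical stand-ins for whatever the base tool exposes:

```python
from collections import Counter

# Hypothetical sketch of design-element extraction: from each parsed canvas
# element we keep its type and dimensions, and summarize the colors in use.
elements = [
    {"type": "button", "width": 100, "height": 40, "color": "#ff5722"},
    {"type": "image",  "width": 320, "height": 180, "color": "#ffffff"},
    {"type": "button", "width": 100, "height": 40, "color": "#ff5722"},
]

features = {
    "types": Counter(e["type"] for e in elements),
    "dims": [(e["width"], e["height"]) for e in elements],
    "colors": Counter(e["color"] for e in elements),
}
print(features["types"])  # Counter({'button': 2, 'image': 1})
```

A summary of this form is what the downstream panels (recommendation, evaluation) would consume on each design update.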
Capturing and Processing Users’ Design on Canvas
Recommendation Panel
To achieve this goal, we calculated similarity scores between the user's design and the examples so that we could rank the examples in order of similarity. When there is no component on the user canvas (e.g., at the start), random samples are drawn. We also allowed users to pin examples so that the selected examples remain in the list; they can likewise unpin them (Figure 16 PIN).
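The ranking step can be sketched as follows. The feature representation (element-type counts) and cosine similarity are our own illustrative choices, not necessarily the exact scoring used by GUIComp:

```python
import numpy as np

# Hypothetical sketch of example ranking: each design is summarized as a
# feature vector (here, counts of element types), and examples are ranked
# by cosine similarity to the user's current design.
def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

user_design = np.array([3, 1, 2, 0])   # e.g., counts of buttons, images, ...
examples = {
    "ex1": np.array([3, 1, 1, 0]),
    "ex2": np.array([0, 4, 0, 2]),
}
ranked = sorted(examples, key=lambda k: cosine(user_design, examples[k]),
                reverse=True)
print(ranked)  # ['ex1', 'ex2']
```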
When the user clicks on a template instance, the user canvas is cleared, and then the elements of the clicked template fill the user canvas in the same layout and alignment as shown in the template.
Attention Panel
We speculated that users might want to 'keep' some templates while changing others as they progress. When a color in the palette is hovered over, the color's RGB information is displayed to help users reference and learn color combinations in examples. To enable this automated element-populating function, we first mapped the RICO leaf-level node information (e.g., x, y, width, height, and type) to that of the Kakao Oven elements (Figure 16 A3), and we calculated the ratio of the RICO nodes to the width and height of the screen and of the Oven elements to the width and height of the canvas.
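The ratio mapping described above amounts to expressing each node's geometry as a fraction of the source screen and rescaling it to the target canvas. This is a minimal sketch with assumed field names and screen/canvas sizes:

```python
# Hypothetical sketch of the RICO-to-canvas ratio mapping: node coordinates
# are normalized by the source screen size and scaled to the canvas size.
def map_node(node, screen_w, screen_h, canvas_w, canvas_h):
    return {
        "x": node["x"] / screen_w * canvas_w,
        "y": node["y"] / screen_h * canvas_h,
        "width": node["width"] / screen_w * canvas_w,
        "height": node["height"] / screen_h * canvas_h,
        "type": node["type"],  # element type carries over unchanged
    }

rico_node = {"x": 140, "y": 500, "width": 1160, "height": 200, "type": "Button"}
# Map from an assumed 1440x2560 screen to an assumed 360x640 canvas.
print(map_node(rico_node, 1440, 2560, 360, 640))
```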
Note that users can restore previous designs by using a reset function from the base tool.
Evaluation Panel
To help users understand the strengths and weaknesses of their current design compared to those in the RICO dataset (R2) [67], the evaluation panel (Figure 16 B) provides six visual complexity scores and one overall rating score for the user's current design. The font size and type unit (best score: 1.0) examines the consistency of the font sizes and types used in the text. For example, the sample design in the user canvas (Figure 16 A2) can be evaluated as a high-quality design based on the positions of the red bars across the distributions in each evaluation dimension (Figure 16 B).
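A unit of this kind can be sketched as follows. The exact formula is our own assumption (GUIComp's actual scoring may differ); it only illustrates how a consistency score with a best value of 1.0 can be derived from the text elements:

```python
# Illustrative sketch (our own formulation, not the exact GUIComp metric):
# the score is 1.0 when every text element shares one (size, family)
# combination, and decreases as more combinations appear.
def font_consistency(text_elements):
    combos = {(e["size"], e["family"]) for e in text_elements}
    return 1.0 / len(combos) if combos else 1.0

texts = [
    {"size": 14, "family": "Roboto"},
    {"size": 14, "family": "Roboto"},
    {"size": 20, "family": "Roboto"},
]
print(font_consistency(texts))  # 0.5 (two distinct size/family combinations)
```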
The results of the recommended cases are displayed with black lines when the user hovers over the case in the recommendation panel.
Implementation Notes
For example, when a button element is dragged from the Elements panel (A3) and dropped onto the canvas (A2), the Evaluation panel is updated with newly calculated scores to reflect the effects of dropping the button. To evaluate GUIComp, we conducted a user study where 30 participants were asked to create two GUI designs: one with constraints and one without. We then presented the designs to online workers to assess the quality of the designs.
RQ1: Does GUIComp help users get their designs closer to a design acceptable to general users than Kakao Oven does?
Participants, Procedure, Apparatus, and Tasks
During the experiment, we gave participants two tasks to evaluate whether GUIComp is effective both when design constraints are given by clients (e.g., embedding brands [43]) and when no constraints exist. The two tasks we used were a user profile interface with constraints (T1) and an item list interface without any constraints (T2). We chose the two interfaces as tasks for users with little design experience because we thought that 1) the difficulty level was appropriate, since the basic design components, such as icons and buttons, are provided by Kakao Oven, and 2) the interfaces are commonly required, since many applications need to collect user information and display items in a listing interface, regardless of application category. We excluded eye-tracking data for 6 EG participants, as the data did not contain complete prototype trials due to a malfunction.
In Session 2, we asked 21 other MTurk workers to evaluate the designs with a rubric, which provided the evaluation metrics (Figure 16 B element balance, alignment, color unit, font and element size, and density) as rating criteria.
GUIComp Users Produce Designs Acceptable to General Users (RQ1)
We chose web workers for evaluation because the goal of GUIComp is to help users produce designs that are acceptable to general users, and such relative levels of acceptability can be measured by ratings given by web workers as general mobile GUI users. We gave workers a rubric to provide minimal guidance to prevent inconsistent and subjective assessment [66]. We believe that rubric grading was appropriate as a grading guide, as it includes indices for visual complexity (eg, color, size, density, and alignment), which affects the overall quality of GUI designs [39,42,43].
Next, we present the evaluation results using Welch's t-test and the Kruskal-Wallis test due to unequal variance in the data.
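Both tests are available in SciPy; Welch's t-test is the independent-samples t-test with `equal_var=False`. The rating samples below are made up for illustration, not the study's actual data:

```python
from scipy import stats

# Illustrative rating samples (not the study's actual data).
guicomp = [4.2, 3.8, 4.5, 4.0, 4.4, 3.9]
baseline = [3.1, 3.6, 2.9, 4.1, 3.0, 3.3]

# Welch's t-test: independent-samples t-test without assuming equal variance.
t, p_t = stats.ttest_ind(guicomp, baseline, equal_var=False)

# Kruskal-Wallis: rank-based test that does not assume normality.
h, p_h = stats.kruskal(guicomp, baseline)

print(f"Welch t = {t:.2f}, p = {p_t:.3f}")
print(f"Kruskal-Wallis H = {h:.2f}, p = {p_h:.3f}")
```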
Using GUIComp Is Enjoyable, Satisfactory, and Affordable (RQ2)
Participants using GUIComp liked the feedback feature and enjoyed how the multi-faceted feedback (R1) dynamically updated in real time (R4) throughout the design process. Participant 2 (P2) said: "It was nice to plan with the data (i.e., the feedback provided by GUIComp), because until now my intuition was the only option I could rely on during planning." No participants reported any perceived clutter or distraction from GUIComp's real-time feedback during the design process.
P1 stated that "I think it was fun seeing the feedback change in real time based on my interactions." We see no statistically significant differences between the two conditions regarding how user-friendly and easy to use and learn the tools were.
Users Employ Multi-faceted Feedback for Overcoming Difficulties in the Design Process
In post-experiment interviews, we observed multiple reasons for the positive results, including the real-time feedback, which enabled efficient iteration of the design process with high satisfaction. We note that participants tended to use the evaluation panel more in the later stages of their design process (e.g., Figure 18 P3_T1, red parts at the bottom). We find no statistically significant difference in viewing duration between the recommendation and evaluation panels.
In summary, we believe that all feedback in the panels contributed to improving user satisfaction when using GUIComp.
Limitations
Klemmer, "Designing with interactive example galleries," in Proceedings of the ACM CHI Conference on Human Factors in Computing Systems, 2010.
In Proceedings of the ACM CHI Conference on Human Factors in Computing Systems, 2015.
W., "Teaching Experienced Developers to Design Graphical User Interfaces," in Proceedings of the ACM CHI Conference on Human Factors in Computing Systems, 1992.
Klemmer, "Bricolage: example-based retargeting for web design," in Proceedings of the ACM CHI Conference on Human Factors in Computing Systems, 2011.