Enhance your creativity.

Simplicity and adaptability, a new dialogue for design.

Introduction

What is A.L.L.I.E?

This project is a conceptual AI design assistant in Adobe Illustrator, aiming to make the design process more efficient, encourage exploration, and help validate ideas. We followed a 7-week process incorporating machine learning UX, user research, human-centered design, agile design thinking, wireframing, and prototyping.

Team

Justin Catalano

Anna Fang

Aaron Tran

Luke Witty

Anderson Zhou

My role

UX lead

User Interviews

Visual Design

Journey Map

Product Map

Feature Development

Wireframing

Prototyping

We were asked

What if AI could be trained specifically for you?

Your styles, preferences, approach, etc.

How might that work and what could you do with it?

Who and what?

We interviewed 8 design students to understand their design process. Their pain points were in
Efficiency
Personalization
Help

How can A.I. help?

Our research guided our ideation toward solutions in
Workflow
Customization
Feedback

Humane interaction.

What type of interaction can we create that builds trust with a computer brain?
Clean & simple
Out of the way
Conversation

Asset Generator

The AI-driven asset generation function enables designers to produce usable assets that can be edited within Adobe Illustrator. It offers a variety of shapes, layouts, fonts, and colors to spark creativity and encourage experimentation. A conceptual sketch of this input-to-variations flow follows the feature list below.

Asset Generation activates based on input questions and pulls generations from user data

Vector Generation, generates a manipulatable vector based on the input, four variations

Shape Generation, generates any shape that can be manipulated, four variations

Color Generation, four different color variations of the work on the artboard

Font Generation, four different font variations related to what the user asked

Image Generation, generates four different images based on user input

Composition Generation, takes whatever is on the artboard and rearranges it, four variations

History, shows everything generated
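
To make the concept concrete, here is a minimal, purely illustrative sketch of how a single request could map to the four-variation output described above. None of this is A.L.L.I.E.'s actual implementation; every class and field name is a hypothetical choice for the demo.

```python
# Hypothetical data model for the Asset Generator concept, assuming one prompt
# activates one generator type and always returns four editable variations.
from dataclasses import dataclass, field
from enum import Enum, auto


class GeneratorKind(Enum):
    VECTOR = auto()
    SHAPE = auto()
    COLOR = auto()
    FONT = auto()
    IMAGE = auto()
    COMPOSITION = auto()


@dataclass
class GenerationRequest:
    prompt: str                    # what the designer typed or said
    kind: GeneratorKind            # which generator the prompt activates
    use_user_library: bool = True  # pull from the designer's uploaded work


@dataclass
class GenerationResult:
    request: GenerationRequest
    variations: list = field(default_factory=list)  # the four editable options


@dataclass
class History:
    results: list = field(default_factory=list)  # everything ever generated

    def add(self, result: GenerationResult) -> None:
        self.results.append(result)


# Example: a font prompt would activate the font generator, and its four
# options would be stored in History so the designer can return to them later.
request = GenerationRequest(prompt="four bold sans-serif font options",
                            kind=GeneratorKind.FONT)
```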

Feedback Experts

This feature interacts with designers to answer questions and provide step-by-step guides. It also gives technical recommendations based on standards, audience, and accessibility. Designers can turn it off if needed.

Experts activate based on input questions and pull from accessibility guidelines and trusted resources

Suggestion Expert, watches your workflow and gives advice on how to improve it

Design Principle Expert, evaluates design and shows how principles can be used

Audience Expert, predicts how a certain group of viewers will interpret the design

Accessibility Expert, explains how to make a more accessible design

Compositional Expert, provides guidance and feedback on the current composition

History, shows everything generated

Concept Prototype

Here is the prototype, created using Figma. It starts with a tutorial to teach the AI tools, then follows a design process from a blank canvas to a full design, and ends with validation. FYI: because this is more of a storytelling prototype, use the yellow bar to progress through it. You can select different generations, but only one of them will move you on to the next step.

Tech Research

A brief on A.I.

Artificial intelligence (A.I.) enables machines to streamline human behavior and tasks.

Machine learning (ML) is a branch of A.I. that uses techniques that allow computers to learn from the inputs they are given.

Multi-layer neural networks are used for computation in deep learning (DL), a type of machine learning.

These neural networks can be trained to perform certain tasks faster than humans can.
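
Since this project stayed conceptual, here is a minimal sketch of what a multi-layer neural network actually looks like: a tiny NumPy network that learns the XOR function from example inputs. It is purely illustrative; the layer sizes, learning rate, and iteration count are arbitrary choices for the demo, not anything A.L.L.I.E. uses.

```python
# A tiny multi-layer (one hidden layer) neural network, trained with plain
# gradient descent to learn XOR from the four example inputs below.
import numpy as np

rng = np.random.default_rng(0)

# Training inputs and targets: the XOR truth table.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two weight layers plus biases make this a "multi-layer" network.
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


learning_rate = 1.0
for _ in range(5000):
    # Forward pass: compute the network's current predictions.
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Backward pass: nudge weights and biases to reduce the prediction error.
    grad_output = (output - y) * output * (1 - output)
    grad_hidden = (grad_output @ W2.T) * hidden * (1 - hidden)
    W2 -= learning_rate * hidden.T @ grad_output
    b2 -= learning_rate * grad_output.sum(axis=0, keepdims=True)
    W1 -= learning_rate * X.T @ grad_hidden
    b1 -= learning_rate * grad_hidden.sum(axis=0, keepdims=True)

# After training, the predictions approach the XOR targets [0, 1, 1, 0].
print(np.round(output, 2))
```

The "learning from inputs" mentioned above is exactly this loop: the network's weights are adjusted a little on every pass until its outputs match the examples.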

SWOT analysis

We looked at different art-focused A.I. products that are currently on the market: OpenAI's DALL-E and ChatGPT, Canva's Magic Write and Text-to-Image, and Replika's A.I. companion. Click the image to access the SWOT analysis.

Initial goals

How might we develop an AI tool that will help Adobe programs become more accessible?

How might we use AI to create a solution that can help the design process move faster?

How might we create an AI tool that can make the learning process easier for new designers?

User Research

We want to target the next generation of lifetime users

College students (18-23) of all majors, ranging from beginner to experienced with Adobe design programs.

Getting personal

In-person interviews with 8 college students who primarily use Adobe programs

What was the learning process like when you first started using Adobe?

“It definitely took a while to get used to everything and understand the main function because of how many options and freedom Adobe programs can offer”

"I feel frustrated because when there is a specific problem I need a solution to, I can never find it in Illustrator or online."

Do you think AI could help you learn Adobe Creative Cloud programs better?

“Yeah, definitely. Like if there was some kind of AI assistant that could give me tips and stuff based on my specific needs, that would be really helpful.”

Do you think AI could help you learn Adobe Creative Cloud programs better?

Unanimous

YES!

What are the issues?

We gathered our research to understand the main pain points.

“I am a designer who often spends significant time creating and exploring different assets so that my designs are effective. I wish that I could fully explore different aspects of the design, but I have found that I either get a creative block or am faced with time constraints.”

“As a new designer to the Adobe suite, I feel overwhelmed and uncertain about how to use the tools and what techniques to use. I want to build more skills so that I produce work to the best of my ability. I try to use online tutorials, but I struggle with the back and forth and don't feel like I will retain that information.”

Influential themes were

Learning curve, Time constraints, Understanding tools

Places to maximize are

Efficiency, Personalization, Easier learning process, Guidance, Help

How might we use A.I. to

Assist in the exploration, efficiency, and application of their ideas

Improve their engagement when using Adobe services

Acquire personalized guidance, support, and workflows

Conceptualizing

How can we solve these issues?

After researching the technical side of M.L. and how users work, we can now address users' needs.

Workable Asset Generation

How can we make generated work usable?
Live display
Customizable
Easy to use

Feedback Experts

What do we need to have informative feedback?
Provides help
Live guidance
Personalization

Interaction points

What does the interface need for a user to be successful?
Human input
A.I. output
Overlaying

Empathy Mapping

  • Say
    “I thought there would be more people with the same level of knowledge as me in this program.”

    “I’m sorry I keep asking for help.”

    “I’m so stressed, there are so many tools in Illustrator.”

    Do
    “I am constantly looking up the keyboard shortcuts for Illustrator.”

    “I am spending extra time on weekends to learn the tools.”

    “I make a lot of mistakes in my class assignments.”

    Phineas, 2nd year student
    Student switching from studying Business in his undergraduate program to Design. Doesn't know a lot about design.

    Think
    “How ahead are my classmates in terms of Adobe skills?”

    “Why is this taking so long!”

    “Is there a quicker way than watching tutorials on YouTube to learn Adobe programs?”

    Feel
    “I feel discouraged as my peers have a lot more knowledge and experience in Adobe than me.”

    “I feel stressed with my pile of design assignments that I have to spend about 2 hours on each.”

  • Say
    “I will try using this feature to see if it will help with my design.”

    “I don't remember what the shortcut is for this function.”

    “I need to get this assignment done on time.”

    Do
    “I should research tutorials for features I'm not familiar with so I can implement them.”

    “I should ask my professor for tips on how I can approach my design work.”

    “I keep trying different solutions, but it's still not working.”

    Michelle, 3rd year student
    She has experience with Adobe programs, but she would like to further develop her skills to become better.

    Think
    “What was the shortcut for this function?”

    “Which tool should I use to solve this problem?”

    “How do I find the right solution to solve this specific problem?”

    Feel
    “I feel frustrated by how much work I need to get done.”

    “I feel overwhelmed by how long it takes me to get my work done.”

    “I feel confused about what steps to take to get my work how I want it to be.”

  • Say
    "I have no idea where to start."

    "I do not know how to use a lot of these functions and tools."

    "I need to complete this design assignment."

    Do
    "I should search online for tutorials when I am stuck on something."

    "I should learn the basic functions to have a general understanding."

    "I am constantly making mistakes."

    Robert, 3rd year student
    He is a college student learning how to use Adobe programs for his design classes. He is fairly new to using them.

    Think
    "I wonder if I am doing this right"

    "Is there an easier way for me to do this?"

    "I dont know how to create this. what should I do?"


    Feel
    "I feel overwhelmed with all the different features and functions."

    "I wish there was a faster and easier way to learn all this stuff."

    "I wonder if there is a different way for me to go about this problem."

What is the user journey?

We created current and future state journey maps for the Workable Asset Generation and Feedback Experts to understand the experience and emotions of the user. Click the image to access the Journey Map.

How does this work?

We created the architecture of how the Workable Asset Generation and Feedback Experts would work together. Click the image to access the Information Architecture in a new tab.

Design Process

Let's build!

The design process for this project took 5 rounds, each focused on the idea, the design, and the feedback. After feedback was received, a new round would start, continuing until the project deadline. It was important to validate our ideas after each round so that the final deliverable would be based on what users need.

First Round

Understanding the connection.

We chose to make the interaction a chat, since the goal is to make a design assistant. This would be represented by a type-in chat box and a voice input. There are two entrance points and two different ways of interacting. The functionality map was a rough draft of how the system would work; changes were made to it later in the process. Click the image to access the FigJam document.

Starting the flow.

Building out the wireframes:

We decided to build out a simple, single use case of the main points of interaction. This brought to light the challenge of making something complex simple. Click the image to view.

A familiar approach.

  • Generator Interface

    The generator would activate based on whatever you typed or said. It would then be displayed under the corresponding icon. This interface would act as a main hub.

    1. Text box, entry point
    2. Creates a design from scratch
    3. Vector generation
    4. Color generation
    5. Shape generation
    6. Rearranges composition
    7. Image generation
    8. Font generation
    9. Voice input

  • Feedback Interface

The feedback would be given after typing or speaking. The information would be displayed under each tab, with this interface acting as a main hub. We were also testing a word-only display.

    1. Recommendations
    2. Efficiency steps
    3. Generated design feedback
    4. Generated audience reception
    5. Matches the designer's style
    6. Compares previous work
    7. Set your own parameters
    8. Additional resources
    9. Voice input

  • History

The history would be presented per function, since each function does something different. All points of history would be accessible, and the user could re-use them if they wanted to.

    1. Search through history
    2. Shows generations
    3. Voice input

  • Library

You would upload your own work into the library so the A.I. could generate similar designs. This could be done easily and would work similarly to Adobe Creative Cloud's Libraries system.

    1. Search through your library
    2. Shows designs
    3. Voice input

What do you want?

After we presented this to our users, they didn't like it. They wanted to see

How the AI talks back, where that is and what it looks like

An easier interaction

The UI as more of an overlay and its own element

Second Round

Let's take a turn, and see where it goes!

We wanted to create an interface applicable to all Adobe programs. We prioritized the generator and feedback functionality while removing tutorials. The main goals were to define the generator and feedback features for designers' needs, make the interface engaging, and explore designs that balance modularity and uniqueness.

Think modular.

A new interaction

You would still interact with this interface through the voice or chat entrance, but this time whatever you asked would be displayed in a main area. The generated information was displayed in a text-message format to show communication. New affordances were added by changing the icon based on the feature and highlighting which function was active.

  • Drop down menu

  • Asset gen active

  • Feedback active

  • Multiple generators on

  • Example interaction

What do you think now?

We received a better reaction, but there's more.

We shouldn't stray too far away from Adobe's style guide

How can you make this a dialogue?

The layout is very usable and friendly

Third Round

Updating the UI

New Allie icon, opens and closes

New voice activation

Dynamic text island, yay movement

Finalizing the flow

New wireframes:

This round of wireframes was more intense than the last. We needed to get a feel for how our prototype would eventually work, so we built it out. There was now a clearer picture of how some of the main functions would work and what they would look like. Click the image to access the Low Fidelity Wireframe page.

Did we get it now?

We received a better reaction, but there's more.

Further the chat interaction.

What does the motion look like?

How would multiple functions work at the same time?

Fourth Round

A more finalized design starts to emerge
  • Small drop down menu

  • Main start

  • Conversation bubbles

  • Display of generations

What about Experts!

We had a flash of insight

During one of our last feedback discussions, we had a revelation about how to incorporate the Feedback feature: make "Experts" on design topics through machine learning. After we had this idea, we looked back at our research to see which areas designers needed help with, building on our original feedback concepts. The A.I. Experts offer a place to get answers to design questions that would usually take time to find, speeding up the design experience.

Last time, we promise.

It was time to get our final round of user testing: how do you feel now?

Minor design details

Further develop Feedback Experts

Think thoroughly about the functionality for the prototype

Introducing,
A.L.L.I.E.

Asset Generator

The AI-driven asset generation function enables designers to produce usable assets that can be edited within Adobe Illustrator. It offers a variety of shapes, layouts, fonts, and colors to spark creativity and encourage experimentation.

Asset Generation activates based on input questions and pulls generations from user data

Vector Generation, generates a manipulatable vector based on the input, four variations

Shape Generation, generates any shape that can be manipulated, four variations

Color Generation, four different color variations of the work on the artboard

Font Generation, four different font variations related to what the user asked

Image Generation, generates four different images based on user input

Composition Generation, takes whatever is on the artboard and rearranges it, four variations

History, shows everything generated

Feedback

This feature interacts with designers to answer questions and provide step-by-step guides. It also gives technical recommendations based on standards, audience, and accessibility.

Experts activate based on input questions and pull from accessibility guidelines and trusted resources

Suggestion Expert, watches your workflow and gives advice on how to improve it

Design Principle Expert, evaluates design and shows how principles can be used

Audience Expert, predicts how a certain group of viewers will interpret the design

Accessibility Expert, explains how to make a more accessible design

Compositional Expert, provides guidance and feedback on the current composition

History, shows everything generated

Highlights

  • Main starting interface

  • Generator after multiple functions in use

  • Feedback after multiple functions in use

  • Exit phase

Concept Prototype

Here is the prototype, created using Figma. It starts with a tutorial to teach the AI tools, then follows a design process from a blank canvas to a full design, and ends with validation. FYI: because this is more of a storytelling prototype, use the yellow bar to progress through it. You can select different generations, but only one of them will move you on to the next step.

Behind the scenes.

Okay, really the last time

How did you feel about this final prototype?

Overall engaging - good examples of the commands, and the pictures show the tools

Good step by step for creating assets

Understood the story of interaction, ease of access 

Conclusion

Our next steps would be

Would like to be able to deselect the tools being used to remove elements

Have the option to manually switch between feedback and asset generation

Create deeper interactions within functions that use non-Allie features.

Final thoughts

This project really pushed my understanding of making conceptual interactions. Since we didn't make an actual A.I., it was interesting to explore what was possible, as there were really no limits. It was great working with Jason, whose help directed us towards a solution. We all spent many hours together, and it was a great time and very fun!

Designed by Justin Catalano

DuneSystems

2024 ©