Enhance your creativity.

Simplicity and adaptability, a new dialogue for design.
Introduction
What is A.L.L.I.E?
This project is a conceptual AI design assistant for Adobe Illustrator, aiming to make the design process more efficient, encourage exploration, and help validate ideas. We followed a 7-week process incorporating machine learning UX, user research, human-centered design, agile design thinking, wireframing, and prototyping.
Team
Justin Catalano
Anna Fang
Aaron Tran
Luke Witty
Anderson Zhou
My role
UX lead
User Interviews
Visual Design
Journey Map
Product Map
Feature Development
Wireframing
Prototyping
We were asked
What if AI could be trained specifically for you?
Your styles, preferences, approach, etc.
How might that work and what could you do with it?
Who and what?
We interviewed 8 design students to understand their design process and found pain points in
Efficiency
Personalization
Help
How can A.I. help?
Our research guided our ideation toward solutions in
Workflow
Customization
Feedback
Humane interaction.
What type of interaction can we make that creates trust with a computer brain?
Clean & simple
Out of the way
Conversation
Asset Generator
The AI-driven asset generation function enables designers to produce usable assets that can be edited within Adobe Illustrator. It offers a variety of shapes, layouts, fonts, and colors to spark creativity and encourage experimentation.
Asset Generation activates based on input questions and pulls generations from user data
Vector Generation, generates a manipulatable vector based on the input, four variations
Shape Generation, generates any shape that can be manipulated, four variations
Color Generation, four different color variations of the work on the art board
Font Generation, four different font variations related to what the user asked
Image Generation, generates four different images based on user input
Composition Generation, takes whatever is on the art board and rearranges it, four variations
History, shows everything generated
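To make the concept concrete, here is a minimal data-model sketch of how a generation request, its four variations, and the history could be represented. Every name, field, and the placeholder generate function is an illustrative assumption for this concept, not an Adobe or production API.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List

class GenerationType(Enum):
    VECTOR = auto()
    SHAPE = auto()
    COLOR = auto()
    FONT = auto()
    IMAGE = auto()
    COMPOSITION = auto()

@dataclass
class GenerationResult:
    kind: GenerationType
    prompt: str
    variations: List[str]  # the concept always offers four options

@dataclass
class GenerationHistory:
    results: List[GenerationResult] = field(default_factory=list)

    def add(self, result: GenerationResult) -> None:
        self.results.append(result)

def generate(kind: GenerationType, prompt: str, user_data: dict) -> GenerationResult:
    # Placeholder: a real assistant would condition a trained model on the
    # user's style data; here we only label four stand-in variations.
    style = user_data.get("style", "default")
    variations = [f"{kind.name.lower()} option {i + 1} ({style})" for i in range(4)]
    return GenerationResult(kind, prompt, variations)

history = GenerationHistory()
history.add(generate(GenerationType.COLOR, "warm palette for a poster", {"style": "minimal"}))
print(history.results[0].variations)
```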
Feedback Experts
This feature interacts with designers to answer questions and provide step-by-step guides. It also gives technical recommendations based on standards, audience, and accessibility. Designers can turn it off if needed.
Experts activate based on input questions and pull from accessibility and trusted resources
Suggestion Expert, watches your workflow and gives advice on how to better it
Design Principle Expert, evaluates design and shows how principles can be used
Audience Expert, predicts how a certain group of viewers will interpret design
Accessibility Expert, explains how to make a more accessible design
Compositional Expert, provides guidance and feedback on the current composition
History, shows everything generated
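As a thought experiment, here is a tiny sketch of how an input question might be routed to one of the Experts above. The keyword lists and the default fallback are assumptions made purely for illustration; a real system would use a trained intent classifier rather than keyword matching.

```python
# Hypothetical routing sketch: decide which Expert answers a question.
EXPERTS = {
    "Suggestion Expert": ["workflow", "faster", "shortcut"],
    "Design Principle Expert": ["contrast", "hierarchy", "alignment", "balance"],
    "Audience Expert": ["audience", "viewer", "demographic"],
    "Accessibility Expert": ["accessible", "accessibility", "color blind", "wcag"],
    "Compositional Expert": ["composition", "layout", "spacing"],
}

def route_question(question: str) -> str:
    q = question.lower()
    for expert, keywords in EXPERTS.items():
        if any(keyword in q for keyword in keywords):
            return expert
    return "Suggestion Expert"  # default when nothing matches

print(route_question("How do I make this poster more accessible?"))
# -> Accessibility Expert
```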
Concept Prototype
Here is the prototype created using Figma. It begins with a tutorial that teaches the AI tools, then follows a design process from a blank canvas to a full design, and ends with validation. FYI: because this is more of a storytelling prototype, use the yellow bar to progress through it. You can select different generations, but only one of them will move you on to the next step.
Learn more!
A brief on A.I.
Artificial intelligence (A.I.) lets machines streamline human behavior and tasks.
Machine learning (ML) is a branch of A.I. that uses techniques that allow computers to learn from the inputs they are given.
Deep learning (DL) is a type of machine learning that uses multi-layer neural networks for computation.
These neural networks train computers to perform tasks faster than humans.
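To ground these definitions, here is a minimal, self-contained example of a machine "learning from inputs": fitting the rule y = 2x + 1 from example pairs with gradient descent, the same basic idea that deep neural networks scale up with many more layers and parameters. The numbers are arbitrary and chosen only for illustration.

```python
# Learn y = 2x + 1 from (input, output) examples with gradient descent.
examples = [(x, 2 * x + 1) for x in range(10)]

w, b = 0.0, 0.0   # parameters the machine adjusts as it "learns"
lr = 0.01         # learning rate: how big each adjustment is

for _ in range(2000):
    for x, y in examples:
        prediction = w * x + b
        error = prediction - y
        w -= lr * error * x   # nudge parameters to reduce the error
        b -= lr * error

print(round(w, 2), round(b, 2))   # approaches 2.0 and 1.0
```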
SWOT analysis
We looked at different art-focused A.I. products that are currently on the market: OpenAI's DALL-E and ChatGPT, Canva's Magic Write and Text-to-Image, and Replika's A.I. companion. Click the image to access the SWOT analysis.
Initial goals
How might we develop an AI tool that will help Adobe programs become more accessible?
How might we use AI to create a solution that can help the design process move faster?
How might we create an AI tool that can make the learning process easier for new designers?
User Research
We want to target the next lifetime users
College students (18-23) of all majors, ranging from beginner to experienced with Adobe design programs.
Getting personal
In-person interviews with 8 college students who primarily use Adobe programs
What was the learning process like when you first started using Adobe?
“It definitely took a while to get used to everything and understand the main function because of how many options and freedom Adobe programs can offer”
"I feel frustrated because when there is a specific problem I need a solution to, I can never find it in Illustrator or online."
Do you think AI could help you learn Adobe Creative Cloud programs better?
“Yeah, definitely. Like if there was some kind of AI assistant that could give me tips and stuff based on my specific needs, that would be really helpful.”
Do you think AI could help you learn Adobe Creative Cloud programs better?
Unanimous
“YES!”
What are the issues?
We gathered our research to understand the main pain points.
“I am a designer who often spends significant time creating and exploring different assets so that my designs are effective. I wish that I could fully explore different aspects of the design, but I have found that I either get a creative block or am faced with time constraints.”
“As a new designer to the Adobe suite, I feel overwhelmed and uncertain about how to use the tools and what techniques to use. I want to build more skills so that I can produce to the best of my ability. I try to use online tutorials, but I struggle with the back and forth, and I don't feel like I will be able to retain that information.”
Influential themes were
Learning curve, Time constraints, Understanding tools
Places to maximize are
Efficiency, Personalization, Easier learning process, Guidance, Help
How might we use A.I. to
Assist in the exploration, efficiency, and application of their ideas
Improve their engagement when using Adobe services
Acquire personalized guidance, support, and workflows
How can we solve these issues?
After researching the technical side of M.L. and how users design, we can now solve users' needs.
Workable Asset Generation
How can we make generating work usable?
Live display
Customizable
Easy to use
Feedback Experts
What do we need to have informative feedback?
Provides help
Live guidance
Personalization
Interaction points
What does the interface need for a user to be successful?
Human input
A.I. output
Overlaying
Empathy Mapping
What is the user journey?
We created current and future state journey maps for the Workable Asset Generation and Feedback Experts to understand the experience and emotions of the user. Click the image to access the Journey Map.
How does this work?
We created the architecture of how the Workable Asset Generation and Feedback Experts would work together. Click the image to access the Information Architecture in a new tab.
Let's build!
The design process for this project took 5 rounds, each focused on the idea, the design, and the feedback. After feedback was received, a new round would start, continuing until the project deadline. It was important to validate our ideas after each round so that the final deliverable would be based on what users need.
First Round
Understanding the connection.
We chose to make the interaction a chat, since the goal is to make a design assistant. This is represented by a type-in chat box and a voice input. There are two entrance points and two different ways of interacting. The functionality map was a rough draft of how the system would work; changes were made to it later in the process. Click the image to access the FigJam document.
Starting the flow.
Building out the wireframes:
We decided to build out a simple single case use of the main points of interaction. This brought to light the challenge of making something complex, simple. Click the image to view.
A familiar approach.
What do you want?
After we presented this to our users, they didn't like it. They wanted to see
How the AI talks back, where that is and what it looks like
An easier interaction
The UI as more of an overlay and its own element
Second Round
Let's take a turn, and see where it goes!
We wanted to create an interface applicable to all Adobe programs. We prioritized the generator and feedback functionality while removing tutorials. The main goals were to define the generator and feedback features around designers' needs, make the interface engaging, and explore designs that balance modularity and uniqueness.
Think modular.
A new interaction
You would still interact with this interface using the voice or chat entrance, but this time whatever you asked would be displayed in a main area. The generated information was displayed in a text-message format to show communication. New affordances were added by changing the icon based on the feature and highlighting which function was active.
What do you think now?
We received a better reaction, but there's more.
We shouldn't stray too far from Adobe's style guide
How can you make this a dialogue?
The layout is very usable and friendly
Third Round
Updating the UI
New Allie icon, opens and closes
New voice activation
Dynamic text island, yay movement
Finalizing the flow
New wireframes:
This round of wireframes was more intense than the last. We needed a feel for how our prototype would eventually work, so we built it out. There was now a clearer picture of how some of the main functions would work and what they would look like. Click the image to access the Low Fidelity Wireframe page.
Did we get it now?
We received a better reaction, but there's more.
Further the chat interaction.
What does the motion look like?
How would multiple functions work at the same time?
Fourth Round
A more finalized design starts to emerge
What about Experts!
We had a flash of insight
During one of our last feedback discussions, we had a revelation about how to incorporate the Feedback feature: create "Experts" on design topics through machine learning. After we had this idea, we looked back at our research to see which areas designers needed help with, building on our original feedback concepts. The A.I. Experts offer a place to get answers to design questions that would usually take time to find, speeding up the design experience.
Last time, we promise.
It was time to get our final user testing: how do you feel now?
Minor design details
Further develop Feedback Experts
Think thoroughly about the functionality for the prototype
Asset Generator
The AI-driven asset generation function enables designers to produce usable assets that can be edited within Adobe Illustrator. It offers a variety of shapes, layouts, fonts, and colors to spark creativity and encourage experimentation.
Asset Generation activates based on input questions and pulls generations from user data
Vector Generation, generates a manipulatable vector based on the input, four variations
Shape Generation, generates any shape that can be manipulated, four variations
Color Generation, four different color variations of the work on the art board
Font Generation, four different font variations related to what the user asked
Image Generation, generates four different images based on user input
Composition Generation, takes whatever is on the art board and rearranges it, four variations
History, shows everything generated
Feedback
This feature interacts with designers to answer questions and provide step-by-step guides. It also gives technical recommendations based on standards, audience, and accessibility.
Experts activate based on input questions and pull from accessibility and trusted resources
Suggestion Expert, watches your workflow and gives advice on how to better it
Design Principle Expert, evaluates design and shows how principles can be used
Audience Expert, predicts how a certain group of viewers will interpret design
Accessibility Expert, explains how to make a more accessible design
Compositional Expert, provides guidance and feedback on the current composition
History, shows everything generated
Highlights
Concept Prototype
Here is the prototype created using Figma. It begins with a tutorial that teaches the AI tools, then follows a design process from a blank canvas to a full design, and ends with validation. FYI: because this is more of a storytelling prototype, use the yellow bar to progress through it. You can select different generations, but only one of them will move you on to the next step.
Behind the scenes.

Okay, really the last time
How did you feel about this final prototype?
Overall engaging - good examples of the commands, and the pictures show the tools
Good step by step for creating assets
Understood the story of interaction, ease of access


Conclusion
Our next steps would be
Allow users to deselect the tools being used to remove elements
Have the option to manually switch between feedback and asset generation
Create deeper interactions within functions that use non-Allie features.