Everlytic
Email Marketing Analytics Dashboard
Everlytic is a South African startup that lets you create and send beautiful, effective email campaigns through its web-based platform, which ties into social and mobile messaging with rich analytics. Everlytic enables users to improve their ROI with highly targeted messages, advanced segmentation and a 360° view of their target audience.
Email Analytics Dashboard
Project
Rich email campaign analytics are what really set Everlytic apart from competitors like Mailchimp, Salesforce and ActiveCampaign, and it was here that I realised I would have the most short-term impact. The challenge was that the current implementation made the data hard to read and easy to misunderstand. Users loved looking at the campaign graphs, but needed clarity on what they were looking at before they could put that power to full use in their own businesses.
Goals
The Everlytic product is such an expansive, large-scale undertaking that it was massively exciting to jump into this project as the first product designer. After I had spent time analysing the product and its competitors, and surveying the internal departments of the business and its users, the goals were clear.
The users needed a powerful platform to help them improve their business communication. The company needed a solid product roadmap and development framework to push the product forward, win more sales and land a round of VC funding. My goal was to create an analytics product no other email marketing service offered, providing the level of clarity and insight any marketer would dream of having.
Role
I started out as Everlytic's first UX hire. There was a team of designers in marketing, but nobody taking care of the product itself. The developers building the product had been making design decisions by default, which is common on an engineer-driven product team.
Within a development team of 10, my role was to evangelise UX practices, build empathy for the users and drive a focus on user-centred development, whilst building my own product design team to support the entire product holistically.
Not counting the various stakeholders involved, this entire project was accomplished by myself and one very talented software engineer.
Over the course of a few months I delivered on the initial goals. I then spent six months putting together a product team of nine, made up of front-end developers, copywriters, UX/UI designers and email designers, all focused on the design and experience side of the product, and successfully integrated the design team into an Agile development environment through a hybrid of Lean UX and Design Sprint methodologies.
Original Analytics Dashboard
The Work
Research
Diving into this project, I first needed to understand many facets of the business, the product, its users and the competitors. I interviewed the CEO, the sales team, the support team and the developers who had initially worked on the analytics reporting features. These initial interviews felt like playing psychologist, listening to all the problems everyone faced. From them I learned the importance of each of the metrics listed in the old interface, which problems users dealt with and needed support on the most, and that this interface could become a strong selling point for the sales team if done correctly. At that point I had to learn from the users themselves, and went about some testing.
Unmoderated Usability Testing
Using a service called SessionCam, I performed remote usability testing, recording and documenting about 400 unique sessions of users working through the entire Everlytic application. This immediately opened my eyes to the stumbling blocks users faced: they struggled to find not only the metrics they were looking for, but also the more detailed reporting on specific things like user activity or link performance. Although it's hard to watch people making the same mistakes over and over, it reveals natural and valuable patterns showing where usability needs to be improved immediately.
Heuristic Review
Heuristic Reviews
I then performed a Heuristic Review based on universally recognised heuristics, focused on Functionality, Starting Points, Navigation, Search, Control, Feedback, Forms, Error States, Content, Performance & Help. This review is similar to the SUS (System Usability Scale), except that instead of the ten-item psychometric scale I used a 45-point test covering the heuristics above, made popular by the Nielsen Norman Group. It showed me the overall lack of usability in the product and where easy gains could be made through targeted improvements.
Ethnographic Studies
Having uncovered the “What” and “Why” of my challenge, I still needed to answer the “How, Who & Where”, so I conducted a remote ethnographic study with some fairly involved users at Garmin South Africa. I asked these users to catalogue a day in their lives at the office, along with the activities and contexts they found themselves in, and to list in detail all the tasks, actions and interactions involved in the specific work they had to complete in the application daily.
Simultaneously I conducted Extreme User Interviews, sampling users who were very familiar with the product and users who were completely unfamiliar with it. Although this was insightful, it drew on a small pool of users in a single industry, and I needed a larger sample, so I conducted one last piece of research: a survey sent to roughly 3,000 users covering demographic, psychographic and ethnographic topics.
Persona Generation
Personas
All of this research combined enabled me to draw up three distinct personas with which to brainstorm and validate the designs that followed. I labelled them the Artist, the Data Nerd and the Boardroom Commando. The Data Nerd was the persona I validated the reporting interface's functionality against the most; aesthetically, I needed the Boardroom Commando to feel its beauty and power and give the thumbs-up for purchase orders.
All of this user data ended up not only helping us stay on track with our design and development; it also helped our marketing and sales teams improve their direct and online user targeting, improve PPC campaigns, develop a voice and tone, and improve on brand materials.
Wireframes
Wireframing
Wireframing started in a Design Studio workshop that I facilitated with the developers and the topic specialists. We did lots of high-level sketching on our whiteboards, moved those rough concepts to sheets of A2 paper, and eventually I moved all of that into my rapid-prototyping tools to capture what we had discussed. I wireframed this reports interface over a period of a few weeks, with multiple iterations, back-and-forth feedback from the internal stakeholders and testing with users.
The biggest challenge was making sense of the taxonomies already present and mapping that to new mental models that would be intuitive and easy to understand.
I first separated out all of the content through an audit; elements of content strategy helped here to categorise the data and create modular groups of it.
We then ran an informal card-sorting exercise using these modules of content to put together an in-house folksonomy of sorts. I later tested this against real users, and we iterated as appropriate.
Prototype
UI Concepts
While I was working on the reporting interfaces, I knew I would either have to implement every screen myself as it became ready for production, or leave it to the developers to implement the designs from my work. That meant speccing every screen diligently, which would have added tons of hours to my plate.
Considering as well that no clear aesthetic direction had yet been set for the product, I chose to explore a few UI concepts.
UI Concepts
After this I took a step back and reviewed the users that I was designing for, what their immediate needs were, what they were used to, the industries that I had to cater for and the amount of front-end development that would be needed to get the project off the ground as fast as possible.
I decided that the best approach would be to empower the developers to implement UI elements easily without getting bogged down in stylistic nuances. A modular, component-based and readily available solution was needed. I used the Bootstrap framework, refactored some of the code and restyled it with a safer, more toned-down look and feel that would achieve quicker user uptake, improve usability and create a living pattern library for the developers to use.
The framework shipped with its raw LESS source, which made it relatively quick and easy to restyle the entire framework. I swapped out the default Glyphicons for the Font Awesome set and changed the base typography to Open Sans for its legibility in a UI.
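As a rough sketch of how that restyle works: Bootstrap 3 exposes its design tokens in `variables.less`, so rebranding is mostly a matter of overriding a handful of variables before compiling. The variable names below are Bootstrap's own; the values are illustrative, not the ones actually shipped.

```less
// theme-overrides.less — illustrative values only, not the shipped palette.
// Import Bootstrap's source, then override its variables and recompile.
@import "bootstrap/bootstrap.less";

@font-family-sans-serif: "Open Sans", Helvetica, Arial, sans-serif; // swapped in for UI legibility
@brand-primary:          #3b7dab; // safer, toned-down primary colour (hypothetical)
@border-radius-base:     3px;     // slightly softer corners across components
@headings-font-weight:   600;     // Open Sans semibold for headings
```

Font Awesome is not wired in through these variables; it replaces the Glyphicon markup by loading its own stylesheet and using its `fa` classes instead. Because every component reads from the same variables, a change here propagates through the whole pattern library at once.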
The combination of a strict grid, new icons, fresh colours and clean typography helped bring this new UI to life. It might not be the most original combination of elements, but it left me with an amazing cornerstone to iterate off of going forward.
All of this took only a week; I always aim for working code over painstakingly created visual documentation.
From here on out I could hand over either wireframes or mockups to the developers, who implemented the designs quickly and accurately, leaving me with very little to adjust and correct before deployment.
Final UI Concept
UI Implemented through Bootstrap
The User Experience
With the UI taken care of, it was time to make sure the overall experience was being deliberately crafted, not left to chance. Targeting human emotions is always hard, but with some guidance from the field of psychology I could begin to shape an experience.
Cognitive Behavioural Strategy
I used the following Cognitive Behaviours to improve the experience from an Attention, Persuasion, Comprehension and Memory point of view.
Aesthetic-Usability Effect
Aesthetically pleasing designs are often perceived as easier to use, so I was able to achieve at least perceived improvements in areas where I couldn't change workflows due to restrictions in resources or architecture.
Familiarity Bias
People tend to develop a preference for certain things merely because they are familiar with similar things. I modelled our reports dashboard to other familiar reporting interfaces to help users feel more at home in our new UI.
Feedback Loops
People are more engaged by situations where they see their own actions modify results. I leveraged this through a new metric introduced in the reports interface that we called "Insights".
We give the user insights into their campaign messages, what makes their results good, what makes their results bad, and then provide a checklist for improvements. And every time they return, they can see an improved score based on whether they used the actionable advice or not.
Uniform Connectedness
When elements have uniform visual properties, they are perceived as being related to each other. I used this to great effect by grouping metrics that tied closely in with each other.
Similarly, Proximity and Chunking helped create the uniformity between metrics that I wanted users to associate with each other. This made it easier for users to find the metrics they were looking for.
Curiosity
When people are teased with small bits of interesting information, they are driven to find out more. By giving our users thoughtful but minimal metrics on the dashboard, I could leverage their curiosity to dive deeper into the more detailed areas of reports which previously were overlooked by users.
Competition, Status & Reputation
These three factors go hand in hand. When people share the same environment, they'll often strive to attain things that cannot be shared, and they constantly assess how their interactions will either diminish or enhance their standing relative to other people and their own personal best.
People tend to care more deeply about their personal behaviours when they affect how people perceive them. These three factors combined form a fairly strong emotional bond within users. By merely adding an Industry Average score to the most important metrics in the reports, I could easily motivate users to perform better, to care more for the results they got and find pride in achieving high scores in their marketing campaigns over others.
Cognitive Behavioural Targeting
Results
We implemented a fantastic new reporting interface that was intelligent, intuitive and groundbreaking in our space. During our design workshops the developers would often challenge me on what was possible within their current implementation.
Through the power of design thinking and gentle nudging I helped them dig deeper. We created a product that goes beyond the basic email campaign reporting you find in the popular consumer products.
We came up with innovative features such as best-times-to-send recommendations for email campaigns, social interactions, content quality ratings, list engagement and link performance tracking.
This project achieved its goals. Our users were happier, and we got great case studies from customers who used the new tools to run innovative campaigns. Our sales team consistently hit their targets, and many stretch goals, as the product sold itself. We secured a fantastic round of funding, were crowned South Africa's fastest-growing company and ranked 3rd on the Deloitte Fast 50 list for tech companies in Africa.
After I established my product design team, we went even further and implemented long-term campaign management, drip campaigns, A/B split testing, trigger-based autoresponders and a drag-and-drop email builder.
Learnings
It’s always good to look back at a project, at what went well and what didn’t, and try to learn from those experiences. Even though we reached our business goals, the goals were fuzzy and purely qualitative. We had no hard measure of success and no clear baseline to work from, which makes iterating on those successes very hard.
Had I not faced any restrictions, I would have tracked both quantitative and user experience metrics on the original product to set a clear baseline and benchmark my redesigns against. Similarly, I performed a lot of qualitative research at the beginning of the project, but didn’t get the chance to repeat it afterwards to take a new reading.
In the absence of hard metrics to measure against it becomes increasingly important to test your assumptions as early and often as possible.
The more time you spend in the details of a project the harder it becomes to challenge your own assumptions. You have to always keep an open mind, test with real users and learn at every opportunity available.