Whether it's guerrilla testing, feedback surveys, analytics data, purchasing stats, or good old-fashioned online reviews, there's always a way to find out how people feel about using your web products. That data is gold.
I'm no expert when it comes to conducting research, but I'll always find a way to get it done. Connecting with users is simply too important a step to skip!
This is where I take the findings and map out some opportunities they present. I think this is best done as a collaborative exercise. No one person comes up with ALL the good ideas in life, so get all the people and all the ideas out in the open!
A bit of impact-effort mapping, weighed against the business needs, usually leaves you in a good place to start designing specific solutions with specific goals.
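To make impact-effort mapping concrete, here's a minimal sketch of how the exercise might be captured. The opportunity names and 1-5 scores are entirely made up for illustration; in practice the team would score these together.

```python
# Hypothetical opportunities scored 1-5 for impact and effort by the team.
opportunities = [
    {"name": "Simplify checkout form", "impact": 5, "effort": 2},
    {"name": "Full navigation redesign", "impact": 4, "effort": 5},
    {"name": "Fix broken footer links", "impact": 2, "effort": 1},
    {"name": "Animated hero banner", "impact": 1, "effort": 4},
]

def quadrant(opp, midpoint=3):
    """Place an opportunity in one of the four classic impact-effort quadrants."""
    high_impact = opp["impact"] >= midpoint
    high_effort = opp["effort"] >= midpoint
    if high_impact and not high_effort:
        return "quick win"
    if high_impact and high_effort:
        return "big project"
    if not high_impact and not high_effort:
        return "fill-in"
    return "avoid"

# Sort so the best-value work (high impact, low effort) floats to the top.
for opp in sorted(opportunities, key=lambda o: o["effort"] - o["impact"]):
    print(f"{opp['name']}: {quadrant(opp)}")
```

Nothing about the quadrant thresholds is sacred; the point is just turning a wall of sticky notes into an ordered list you can plan sprints from.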
Yay, design time! How involved I get here depends on the team I'm part of - sometimes I'm there as a designer, other times I'm purely in an engineering role. Either way, at the very least I like every project to have a well-thought-out journey map and some low-fi wireframes.
If working with external stakeholders, or if a complete redesign is being proposed, it can be better to take more time here and flesh out really detailed designs. But often that's not necessary and basic designs will do.
When I say build here, it doesn't always have to be in code. Sometimes building a clickable prototype in a design tool that you can take out to users is the best thing you can possibly do. It all depends on the project.
The point of this step is to make something testable. That's it.
All this means is test it out. Get the new thing out in front of people and track their reactions.
Remember those emotions and specific goals back at the start? That's what you measure against here. Does the new thing improve the experience of your users? Does the new thing meet business goals?
I've done this via in-person focus groups and A/B tests (or A/B/C/D tests!), using tools like Hotjar and Google Optimize, but there are loads of ways to do it.
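If you're rolling your own A/B analysis rather than relying on a tool, the core check is a two-proportion z-test on conversion rates. This is a hedged sketch with invented numbers (the conversion counts and sample sizes below are purely illustrative):

```python
import math

def ab_test_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test comparing conversion rates of variants A and B.

    conv_a / conv_b: number of conversions in each variant.
    n_a / n_b: number of visitors shown each variant.
    Returns the z statistic; |z| > 1.96 is significant at roughly the 95% level.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: 2400 visitors per variant.
z = ab_test_z(120, 2400, 156, 2400)
print(f"z = {z:.2f}, significant at 95%: {abs(z) > 1.96}")
```

The tools do this (and more) for you, but knowing what "significant" means under the hood stops you from shipping a change on the back of noise.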
Sometimes you get lucky and the validation step proves your solution is 100% effective, job done. That's rarely the case. There are almost always a few small tweaks that can improve it.
This is often the stage where final accessibility and usability improvements come in, for example.
Make sure you review from an internal perspective too. What went well, what didn't? Can we refine the solution further? Any improvements that could go into the next sprint?
This is all about teamwork and getting in the best position possible to start the next phase of work.