Reading in progress. Notes under construction.
Click-level data. Self-hosted or external solution.
Must tie outcomes to the bottom line of your report recipients:
- Increase revenue
- Reduce cost
- Improve customer satisfaction / loyalty
I believe that most websites suck because HiPPOs created them. HiPPO is an acronym for the “Highest Paid Person’s Opinion”.
Failing online is cheap and fast.
Surveys, lab usability testing, remote usability testing, etc.
See that you are in a race.
Multiplicity strategy. Utilize various tools.
We tend to pick tools like we are picking a marriage partner. When we choose wrong, we don’t want to accept it.
Want reporting / analysis
Have IT / business strength (or both)
1.0 / 2.0
Visit = Session = A collection of requests
- Persistent cookie
- Can be skewed by browsers that reject cookies, especially third-party ones. First-party cookie rejection rates run 2%–5%; third-party, 10%–30%.
- Daily UV is useless when you look at a period longer than one day, because many web analytics vendors won’t compute a truly de-duplicated Absolute Unique Visitors number for you.
- Problem of computing time on the last page: there is no subsequent request to measure against, so the tool typically records zero time for the final page of a session.
- Multi-tag situation: normalize to a single session
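The de-duplication point above can be sketched with toy data: summing per-day unique visitor counts double-counts anyone who returns, while an Absolute UV count de-dupes across the whole period. (The visitor IDs and logs below are hypothetical.)

```python
# Hypothetical page-view logs keyed by day; each entry is a visitor's cookie ID.
daily_logs = {
    "mon": ["alice", "bob", "alice"],
    "tue": ["alice", "carol"],
    "wed": ["bob", "alice"],
}

# Summing daily UVs counts "alice" three times and "bob" twice.
sum_of_daily_uv = sum(len(set(views)) for views in daily_logs.values())

# Absolute UV de-dupes across the whole period: alice, bob, carol.
absolute_uv = len(set().union(*daily_logs.values()))

print(sum_of_daily_uv, absolute_uv)  # 6 3
```

Hence the warning: three daily reports add up to 6 “unique” visitors when only 3 people ever showed up.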
Definition: percentage of sessions with only one page view.
- Measure Bounce Rates for your website’s top referrers. This tells you who your BFFs are.
- Measure Bounce Rates for search keywords.
Blogs are a unique beast amongst online experiences: people mostly come only to read your latest post. They’ll read it, and then they’ll leave. Your bounce rates will be high because of how that metric is computed, and in this scenario that is OK.
Bounce Rate equates to people taking absolutely no action on your site. If you make an “excuse,” I’ll push back because I don’t fundamentally believe for any site—for-profit or nonprofit—that success is a one-page view.
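The definition above reduces to a one-liner, and segmenting it by referrer, as the notes suggest, is just a group-by. A minimal sketch with made-up sessions (the referrer names and page paths are hypothetical):

```python
from collections import defaultdict

# Each session: the referrer it arrived from and its list of page views.
sessions = [
    {"referrer": "google",  "pages": ["/home"]},                    # bounce
    {"referrer": "google",  "pages": ["/home", "/pricing"]},
    {"referrer": "twitter", "pages": ["/blog/42"]},                 # bounce
    {"referrer": "google",  "pages": ["/home", "/pricing", "/buy"]},
]

def bounce_rate(sessions):
    """Bounce Rate = share of sessions with exactly one page view."""
    return sum(1 for s in sessions if len(s["pages"]) == 1) / len(sessions)

def bounce_rate_by_referrer(sessions):
    """Group sessions by referrer, then compute Bounce Rate per group."""
    groups = defaultdict(list)
    for s in sessions:
        groups[s["referrer"]].append(s)
    return {ref: bounce_rate(group) for ref, group in groups.items()}

print(bounce_rate(sessions))              # 0.5
print(bounce_rate_by_referrer(sessions))  # google ~0.33, twitter 1.0
```

In this toy data, twitter’s 100% bounce rate flags it for investigation, while google sends visitors who stick around.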
- Use with Bounce Rate to make sense.
- In structured experiences, distinguish it from Abandonment Rate to improve the customer experience.
Choosing Visits or Unique Visitors? It depends on the business situation.
Ugly and useless findings come in two types:
- Not accepting the limits of the possible.
- Hiding what is actually being measured.
Ditto for folks who define Engagement as the number of repeat visits by a visitor. In this past week, I visited www.lenovo.com eight times because Lenovo decided to stop supporting System Update. I was stressed and frustrated because I had to locate drivers for my ThinkPad X301 by using its suboptimal internal site search engine! How do you distinguish those visits from someone who visits the Lenovo site regularly to learn about the latest products and updated features?
It’s important to know that if you must overlay your own opinions and interpretations to understand the metric, then you might be on the wrong road.
Quantitative data is limited in that it can measure the degree of engagement but not the kind of engagement.
Some possible options for measuring the kind of engagement:
- Inline / on-exit surveys: Direct / Indirect (e.g. likelihood to recommend)
- Primary market research
- Customer retention over time
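The last option, retention over time, is the easiest to quantify: of the visitors seen in one period, what fraction came back in the next? A toy sketch (the visitor IDs are hypothetical):

```python
# Visitors seen in each period (e.g., month), identified by cookie ID.
period_1 = {"alice", "bob", "carol"}
period_2 = {"alice", "dave"}

# Retention = share of period-1 visitors who returned in period 2.
retention = len(period_1 & period_2) / len(period_1)
print(f"{retention:.0%}")  # 33%
```

Unlike raw repeat-visit counts (see the Lenovo anecdote above), retention at least separates returning people from returning frustration only when paired with a qualitative signal such as a survey.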
If you are the only person who understands the metric or the key performance indicator, then you have just guaranteed that your company will not take action.
Don’t sexify, uncomplexify.
A few years back, I interviewed at one of the biggest companies on the Web. They had just closed their quarter, and it had been tremendously profitable. I asked them the reasons for that great success. The following anecdote is 100 percent true:
Them: “We just kicked off the query against our data warehouse; it typically returns the results in three months.”
Me: Stunned silence.