New Yorkers tend to work through lunch

It’s been a few months since we launched our new Gmail Meter platform, and we’re very thankful for your support and feedback during our critical early stages. In the last couple of months, we have stabilized the platform and optimized our implementation of Google APIs. Finally, we are now ready to start sharing with you some of what we’re learning about email usage.

Introductory disclaimer

Before doing so, we want to be clear about the nature of the data and metrics that we will be presenting to you today. We understand that email is an important communication vehicle that needs to remain secure. As the company that provides some of the world’s largest email providers with the most scalable email and contacts data import system available, we know firsthand how vital email is, and you can rest assured of our utmost commitment to securing your data.

With that said, the following statistics are aggregations of the data used to generate your Gmail Meter reports. In the process of aggregating the data for the following analysis, we exclude the authentication information, thereby completely disassociating the data from the account.

During data aggregation, we have also isolated users who have a standard consumer Gmail address and excluded their data from the following numbers. This was done under the assumption that any other domain name would be a custom domain, which is typically used for business purposes and provides a better look at how our users email for work.

In addition, we do not store any of the end user data used to generate the reports and all such end user data is processed entirely in memory. We only store our generated calculations, primarily to be able to provide continued access to previous reports but also for potential future product changes that may require such data.

The following insights are merely guidelines and standards to compare oneself by. Because we try our best to keep the minimum amount of data on our users, we cannot define them any further than by general timezones. All geographic locations are suggested and approximated from the locale setting of the Google account. We currently do not have any plans to collect any data based on roles of our users and therefore cannot analyze to that granularity.

A few overall numbers and general trends

In this first of our series, we wanted to focus primarily on some of our largest populations – namely our users from the Central, Pacific, and Eastern timezones. And since there is just so much data to cover, today we’ll only be looking at habits regarding sending emails.

First, several general trends appear to apply to most of our population across all three timezones:

While there is a definite and significant decline in outgoing emails overnight, outgoing emails are sent out at every hour of the day for every timezone. Whether this is driven by the general population or by a small subset of outliers will have to wait until we can implement additional calculations into our analysis. Even so, this trend does speak to the general effect of globalization.

Of particular note for each timezone are the average hours of operation. While users from all timezones typically start their day around 6 or 7 AM, timezones can vary substantially in the ways that emails are sent throughout the day. Please also keep in mind that this data is the daily average for each month since July averaged again across months. We understand that these averages can fluctuate over different times of the year.
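To make that averaging concrete, here is a minimal sketch (the function name and sample numbers are ours, purely illustrative): compute the daily average for each month, then take an unweighted mean of those monthly averages, so months with fewer days of data count equally.

```javascript
// Average of averages: first a daily mean per month, then a mean across months.
// Note this is an unweighted mean of means, so each month counts equally
// regardless of how many days of data it contains.
function monthlyThenOverallAverage(dailyCountsByMonth) {
  const monthlyAverages = dailyCountsByMonth.map(
    days => days.reduce((a, b) => a + b, 0) / days.length
  );
  return monthlyAverages.reduce((a, b) => a + b, 0) / monthlyAverages.length;
}

// Example: two months of made-up daily sent-email counts.
console.log(monthlyThenOverallAverage([[10, 20, 30], [40, 60]])); // 35
```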

Below are overall statistics for each timezone; numbers are rounded for simplicity’s sake.

Data on Gmail usage

US Central Timezone

We definitely wanted to take a look first at our own home timezone here in Chicago, which also has the smallest sample size of the three populations we’re looking at today.

  • While our users in the Central timezone send out the fewest number of emails compared to either coast, they still send out slightly more emails than the global average.
  • In addition, users from the Central timezone send more emails per recipient on average.
  • And Central users mostly send emails internally, implying that they may rely on email for longer conversations, primarily with fellow team members.

US Central hourly sent emails

A look at the hourly histogram shows that the number of outgoing emails drops off dramatically after 4 PM, in accordance with the typical 8-hour workday. This contrasts with users from either coast, who appear to work longer days.

US Pacific Timezone

The second largest of our three user populations, our users from the Pacific timezone send out the most emails of the three. This should come as no surprise, as adoption of technology is more prevalent on the west coast than anywhere else in the US. Our Pacific timezone users should also be commended for minimizing the number of emails sent internally, potentially relying on other methods for internal communication (Slack being the obvious possibility).

US Pacific timezone sent emails by hour

Similar to Central timezone users, users in Pacific timezone have two separate peak times for sending out emails with the second peak being the larger of the two. A big difference, besides sending more emails overall, is that our Pacific timezone users email more regularly outside of the typical work hours. The overnight peak between the hours of midnight and 1 AM likely coincides with the beginning of the business day for GMT/UTC, implying that users on the west coast are more likely to collaborate internationally.

US Eastern Timezone

Our Eastern timezone users sit in the middle in terms of the number of emails sent out, and they track the global average almost exactly when it comes to emails sent internally versus externally. Interestingly, they also send the fewest emails per recipient on average.

US Eastern timezone sent emails by hour

To be perfectly clear, our title should actually refer to all users from the Eastern timezone instead of specifically New Yorkers. The above histogram looks a bit different from the two that we’ve already looked at. While the rate does slow down around lunchtime, there doesn’t appear to be a lunch break! In fact, there’s a steady and gradual increase in the number of emails through lunch – a phenomenon we have yet to find in any other timezone of users.

US Mountain Timezone

We didn’t want to spend too much time discussing our Mountain timezone users simply because we currently have very few of them. But the preliminary data appears to show that our users living in the Rockies are some of the most prolific email users, and may actually send more emails than any other timezone.

US Mountain timezone sent emails by hour

But without more users, we won’t know how significant any of the data we’ve presented today is – especially the chart above. As with most statistics, early data can be very volatile, and trends will stabilize as new data is collected. If you’re not currently a Gmail Meter user, please try our Gmail analytics tool to see how you compare to the users we’ve discussed today.

We’ll be publishing more interesting observations and trends in the future, with the hopes of getting a better understanding of how we all manage our emails and if there really is a better way to do so.

As always, we would love to hear your feedback. Please never hesitate to reach out to us via email or by tweeting us @GmailMeter. We also look forward to bringing you more news on some improvements we have in the works for Gmail Meter, so please keep an eye out for updates!

Thanks for reading.

The Importance of Analytics for your Web Application

Last month, we were excited to announce the formation of the Gmail Meter team. Since then, our team jumped right into the analytics (or lack thereof) and we immediately realized how important it was to have these analytics in place before considering things like growth or new product features.

We have a pretty strong commitment to freedom of information and transparency, so in this first series of blog posts, we plan on sharing some of the lessons that we are learning as we progress in our endeavor to bring you the best Gmail analytics tool possible!

Some of these posts will be more technical, others will be more related to customer development, and still others will be focused almost entirely on numbers.

From the title, it should be obvious that today we will be thinking about numbers – many, many numbers. Don’t expect this post to have any detailed how-to’s on things like how to set up Google Analytics, filter out ghost spam, or use Google Tag Manager to track button clicks. These will come later; in the meantime, please feel free to sign up below to stay updated!

Instead, please view this as a broad overview on why in-app analytics are absolutely important to building a good product and how these analytics will help inform you on important product decisions.

How are users behaving?

Even with all the time we’ve already spent over the course of the last couple of months, we still feel that we have barely scratched the surface of Google Analytics.

Setting up Google Analytics is simple and straightforward. All that is required to get started is the addition of a small JavaScript snippet between the <head> tags, but there are so many ways to customize analytics triggers and data presentation that it can be difficult to see at a glance the actual potential of Google Analytics.

Beyond the baseline metrics that are calculated by default, the real power of Google Analytics is the ability to find out what users are doing at every step of their experience with the app. Getting these statistics does require a bit of additional time and setup, but don’t worry – we’ll be covering those steps in a later post. In addition, there are many articles, both Google- and third-party-provided, that can assist with this process.
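For the curious, here is a minimal sketch of what those custom calls look like with the classic analytics.js command queue. The property ID and event names below are placeholders, not our real ones, and the queue definition is a stripped-down stand-in for Google’s full loader snippet.

```javascript
// The official analytics.js snippet defines `ga` as a command queue: calls
// made before the library finishes loading are stored in `ga.q` and replayed
// once the script arrives. This stand-in reproduces only the queueing part.
var ga = ga || function () {
  (ga.q = ga.q || []).push(arguments);
};

ga('create', 'UA-000000-1', 'auto'); // placeholder property ID
ga('send', 'pageview');

// A custom event – category, action, label – e.g. a signup button click:
ga('send', 'event', 'signup', 'click', 'get-my-report-button');
```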

I’ll be breaking down the rest of this post according to the different stages of our user flow, roughly correlated to the AARRR method for simplicity’s sake.

Optimizing signup flow and activation friction

Before we took the time to set up these additional metrics, we had only been logging the email addresses of every user who had initially authorized the Gmail Meter script. While this method is the obvious choice for simple implementation, it did not provide an accurate or full picture of our users’ initial signup experience.

One of the first things we did was to link our Google Analytics account with Google Tag Manager and we set up a custom event trigger that tracked the clicks to our signup button. After just one day of data collection, we were quite shocked to discover that there was a large discrepancy between the number of users who initially clicked that button and the number of email addresses collected during our signup process.
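For illustration, the page-side half of such a click trigger can be as small as a data-layer push. The event and variable names here are hypothetical – in practice GTM’s built-in click triggers can capture this without any custom code at all.

```javascript
// Google Tag Manager reads events from a global array called dataLayer.
// Pushing an object with an `event` key fires any GTM trigger configured
// to listen for that custom event name.
var dataLayer = dataLayer || [];

function trackSignupClick() {
  dataLayer.push({
    event: 'signup_button_click',  // hypothetical custom event name
    buttonText: 'Get My Report',
  });
}

// In the real page this would run inside the button's click handler.
trackSignupClick();
```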

Realizing just how little we knew about how our users progress through the signup flow, we decided that we needed to take it a step further. We set up yet another custom event to capture the actual number of users who complete the entire signup process and land on our “Thank You” page.

Once we got this data, we knew that we had to go even one final step further. Originally, before getting any of this data, our “Thank You” page had actually included a button for first time users to generate their initial Gmail Meter report. Suspicious that this additional step may have actually created friction for activation, we decided to also track how many users actually requested their first Gmail Meter report.

As you can probably already guess by now, the results were a bit shocking.

The data presented below is just a sample of the total data, specifically for the period from January 20th through January 26th.

Of the total users who clicked the “Get My Report” button –

  • 82% of users completed the initial authorization and entered into our internal database
  • 67% of users actually requested their first report to be generated

So this means that from the first step of our signup process (the “Get My Report” click) to activation (actually having their first report generated), 33% of our users were dropping off.
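Treating the percentages above as fractions of the initial “Get My Report” clicks, the drop-off and the step-to-step conversion fall out of a couple of lines of arithmetic. The step labels below are ours, not Gmail Meter’s internal names.

```javascript
// Funnel fractions from the Jan 20–26 sample, normalized to initial clicks.
const funnel = [
  { step: 'clicked "Get My Report"', fraction: 1.0 },
  { step: 'completed authorization', fraction: 0.82 },
  { step: 'requested first report', fraction: 0.67 },
];

// Overall drop-off is simply 1 minus the final fraction.
const overallDropOff = 1 - funnel[funnel.length - 1].fraction;

// Step-to-step conversion shows where users are actually lost.
const stepConversion = funnel.slice(1).map((s, i) => ({
  from: funnel[i].step,
  to: s.step,
  rate: s.fraction / funnel[i].fraction,
}));

console.log(overallDropOff.toFixed(2));          // 0.33
console.log(stepConversion[1].rate.toFixed(3));  // 0.817 – roughly the 82.1% cited below
```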

Conclusions drawn based on analytics

After making these discoveries, we knew that we had to immediately shift priorities to address the shortcomings in our signup and initial activation flow. Previously, we had been focusing on different growth strategies like SEM and additional distribution channels to bring more users to our landing page.

With this data collection set up, we knew that we could make a significant impact in the number of our activated users by focusing on reducing friction for signup and initial activation. In addition, by digging deeper into the data, and separating out each step of our signup flow, we were able to identify and pinpoint the exact steps that needed reworking.

For example, one very easy decision to make was to automate the generation of a user’s first Gmail Meter report. Only 82.1% of users who had completed our signup process actually clicked to generate their initial report. By automating this one step, we are now guaranteeing that 100% of our signed-up users are also activated users (barring any bugs or glitches, which is a story for another day).

This decision may seem intuitive and obvious to us now, but without the right analytics or data we would still be in the dark about our signup flow. Now, if there’s a single takeaway for you, reader, it’s that the right data can and should quickly inform your decision-making process and change your priorities.