This is part 1 of a 4-part series on The Making of Manhattan’s Coffee Kings.
For the Gramener data storytelling hackathon this month, I planned to tell the story of which coffee chain covers the largest population in Manhattan.
This was not an accidental choice. I wanted something that’s map-based. I wanted to merge census data into maps. I also wanted something interesting enough to be marketable.
I read about the fight Starbucks has been facing from the likes of Dunkin’ Donuts and McDonald’s, and decided to write about it.
I had 4 days before the hackathon, so the planning had to be meticulous. I checked with the hackathon team and confirmed that scraping the data and pre-planning the story was within the rules, as long as I executed the story on the day of the hackathon.
The plan was as follows:
I could do all of these except the design and Power BI implementation. I’m not good at either. So I asked for help. My colleagues Devarani & Navya volunteered. Forming the team early helped a lot since we could plan well.
Luckily, we followed the plan almost perfectly, except that “Create the maps” took 2 days instead of 1, so we had only 1 day to “Write the story”. But that didn’t cause a problem.
Getting the Manhattan census map and population data didn’t take much time. Both were on NYC.gov.
Scraping the stores’ location data took a bit of time. Using Chrome’s Developer Tools > Network, I found that McDonald’s store locator uses an API at https://www.mcdonalds.com/googleapps/GoogleRestaurantLocAction.do. It accepted radius= and maxResults= parameters. By increasing both, I was able to write a quick script that fetched the 1,000 closest stores within a 50 mile (or is it kilometer?) radius.
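That script was essentially a single GET request. Here’s a minimal sketch of it in Python. Only radius= and maxResults= are confirmed from the network tab; the latitude/longitude parameter names and the response shape are assumptions.

```python
# Sketch of the McDonald's store fetch.
# Only radius= and maxResults= were confirmed in Developer Tools;
# the location parameter names and response format are assumptions.
import json
import requests

API = "https://www.mcdonalds.com/googleapps/GoogleRestaurantLocAction.do"

params = {
    "latitude": 40.7831,    # assumed parameter name: centre of Manhattan
    "longitude": -73.9712,  # assumed parameter name
    "radius": 50,           # confirmed: search radius (miles or km?)
    "maxResults": 1000,     # confirmed: cap on number of stores returned
}

response = requests.get(API, params=params)
response.raise_for_status()

with open("mcdonalds-stores.json", "w") as f:
    json.dump(response.json(), f, indent=2)
```

In practice the endpoint may also expect country/language parameters or particular headers; the Network tab shows exactly what the site itself sends, which is the easiest thing to copy.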
Starbucks’ store locator was trickier. Its API returns stores around a ?lat= and ?lng= location, but there was no obvious way to increase the number of stores returned. It also doesn’t allow direct access; it has to be fetched as an XHR request.
Rather than solve this elegantly, I just manually moved the store locator map around to cover all of Manhattan and saved the responses. Then I wrote a script that collated all of these JSON responses into a single dataset.
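The collation script itself was tiny. Here’s a sketch, assuming each captured XHR response was saved as a starbucks-*.json file containing a top-level list of store objects with an id field (the actual Starbucks response format may differ, so treat the file names and fields as placeholders):

```python
# Sketch of collating the manually captured Starbucks XHR responses.
# Assumes each capture was saved as starbucks-*.json and contains a
# top-level list of store objects with an "id" field; the real
# response may nest the list differently.
import glob
import json

stores = {}
for path in glob.glob("starbucks-*.json"):
    with open(path) as f:
        data = json.load(f)
    for store in data:               # assumed: top-level list of stores
        stores[store["id"]] = store  # de-duplicate overlapping map views

with open("starbucks-stores.json", "w") as f:
    json.dump(list(stores.values()), f, indent=2)
```

Keying on the store id de-duplicates stores that appear in more than one of the overlapping map views.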
I tried scraping Dunkin’ Donuts’ website but got an Access Denied. I let them know, but it’s still down. So I assumed they won’t last long anyway, and moved on with just the two chains.
At this point, we had all the data and were ready to create the maps.