
Internship at Gramener: Journey of Pranav Vadrevu with our Product Team

Reading Time: 5 mins

Pranav Vadrevu is pursuing an undergraduate Computer Science co-op degree at the University of Waterloo. He joined Gramener as an intern and worked with the product team. We interviewed him about his internship experience. Here’s what he had to say.

Q – How did you plan to work as an intern at Gramener?

The co-op program at my university embeds work experience into my coursework. At the end of my first year, I had my first work term as part of the program. During this term, I worked as an intern with the product team at Gramener.

Q – Is this your first internship or have you been familiar with the domain?

Before the work term began, I didn’t have much experience in web development. Needless to say, I was eager to get a taste of work experience. Since this was my first ever job, I didn’t have the slightest idea about the work I’d be given.

Luckily, Gramener’s Product SVP, Sundeep Reddy Mallu, created a structured plan of tasks for me to execute. Tejesh, Gramener’s Data Science Manager, was my buddy and helped me understand my tasks.

Q – What was your usual day like during your internship at Gramener?

The working hours were flexible, and I usually logged in from 10 am to 5 pm. The flexibility also allowed me to work from home a few times. The team gave me all the essentials, such as a work laptop, ID card, and work account, within the first week.

Q – What tasks did you work on?

Typically, my daily routine involved coming to the office and spending the first half of the day learning the tools that I needed for the tasks. I generally spent the second half of the day integrating my learning into the tasks. Whenever I was stuck, Tejesh helped me work through it. I had weekly touchpoint calls with Sundeep to discuss and show my progress and ideas.

Q – Can you talk about your tasks in detail?

Sure. My first task, which spanned the first two weeks of my internship at Gramener, was to combine data analysis and data storytelling on the 2019 India General Elections dataset.

1. Working on the Election Dashboard

Gramener had already built a dynamic dashboard for Republic TV on this data, so I needed to construct a visual data story of my own. This included replicating Gramener’s work to get comfortable with the process of creating a data application. I worked with the Python pandas module to perform the data analysis. With Tejesh’s help, I settled on a direction for the data story.
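To give a flavor of this kind of analysis, here is a simplified pandas sketch. The file name and column names (constituency, party, votes) are illustrative assumptions, not the actual schema of the elections dataset used in the project.

```python
# A minimal sketch of party-level analysis on election results.
# File and column names are hypothetical placeholders.
import pandas as pd

results = pd.read_csv("election_results_2019.csv")

# Vote share of each candidate within their constituency
results["vote_share"] = (
    results["votes"] / results.groupby("constituency")["votes"].transform("sum")
)

# Winner of each constituency: the row with the most votes
winners = results.loc[results.groupby("constituency")["votes"].idxmax()]

# Seats won per party, a natural input for a bar chart
seats_by_party = winners["party"].value_counts()
print(seats_by_party.head(10))
```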

Over the next week, Tejesh helped me chart the visuals that I designed using Vega and the G1 map viewer. I used HTML, CSS, Bootstrap, and JavaScript to design the application’s frontend. For the backend, I relied on Gramex, Gramener’s rapid-application builder platform.

By the end of the task, here’s how the dashboard looked:

2. Working on InsightsFeed

My next task was to work on the InsightsFeed. It’s a web app that takes any dataset and delivers quality visuals and insights, enabling rapid consumption of insights from the data. The idea was brilliant, and the backend architecture for designing visuals was already implemented.

Using my pandas knowledge, I worked on this backend architecture. We wanted to add features such as insight statements, outlier detection, computing ratios, finding surprising extremes across multiple columns, and identifying time-series data. We also added some basic data cleansing support on the backend, since we rarely ever find ourselves with clean datasets. This work lasted a couple of weeks, after which I wanted to explore something different.
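To illustrate one of these features, here is a rough sketch of how outlier detection across the numeric columns of an arbitrary dataset could work using the 1.5 × IQR rule. This is an illustrative approach, not the actual InsightsFeed implementation.

```python
# A minimal sketch of column-wise outlier detection, not the InsightsFeed code.
import pandas as pd

def find_outliers(df: pd.DataFrame) -> dict:
    """Return values lying outside the 1.5 * IQR fences for each numeric column."""
    outliers = {}
    for col in df.select_dtypes(include="number").columns:
        q1, q3 = df[col].quantile([0.25, 0.75])
        iqr = q3 - q1
        low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
        mask = (df[col] < low) | (df[col] > high)
        if mask.any():
            outliers[col] = df.loc[mask, col]
    return outliers

# Usage: pass in any dataset; columns with surprising extremes are returned.
# df = pd.read_csv("any_dataset.csv")
# print(find_outliers(df))
```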

I segued from the InsightsFeed into creating a web app based on one of my favorite pastimes – movies. If you’re interested in knowing the screen time of characters in various movies, this app is for you.

I began exploring Node and created an Amazon Prime scraper that collects the X-ray metadata of movies on Amazon Prime Video. X-ray data is a JSON file that holds a movie’s metadata. I worked with Puppeteer, a Node library, to extract these X-ray files. I then engineered a web application with Node on the backend that lets users search for a movie and see its screen-time visuals, which were built with Vega.
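To show the general idea, here is a simplified Python sketch of turning scraped X-ray metadata into per-character screen time. The JSON structure used here (scenes with start/end timestamps and a character list) is an assumption for illustration, not Amazon’s actual X-ray schema.

```python
# A rough sketch of aggregating screen time from an X-ray-style JSON file.
# The schema (scenes, start_ms, end_ms, characters) is hypothetical.
import json
from collections import defaultdict

def screen_time_seconds(xray_path: str) -> dict:
    with open(xray_path) as f:
        xray = json.load(f)

    totals = defaultdict(float)
    for scene in xray["scenes"]:
        duration_ms = scene["end_ms"] - scene["start_ms"]
        for character in scene["characters"]:
            totals[character] += duration_ms / 1000.0

    # Sort characters by total screen time, most to least
    return dict(sorted(totals.items(), key=lambda kv: kv[1], reverse=True))
```

The resulting character-to-seconds mapping is the kind of table a Vega bar chart of screen time could be driven from.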

3. End-to-End Data Project (Startup scene)

After my segue, it was time to get back on track with my unfinished task: creating an end-to-end data project on a topic of my liking. After fiddling around with multiple topics, I settled on India’s startup scene. I’m still working on it.

To be more creative, I used D3, another library for creating visuals that is much less restrictive than Vega. I wanted to explore visuals and dashboard ideas that I hadn’t tried before, so I created a circle-packing graph to show the hierarchical layout of the startup market. I also created a dynamic bar chart with filters. Here’s the dashboard.
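A circle-packing view needs nested data, so a fair amount of the work is reshaping a flat table into a hierarchy. Here is a hypothetical Python sketch of that preparation step; the columns (sector, sub_sector, startup, funding) and file names are assumptions for illustration, not the actual startup dataset.

```python
# A minimal sketch of reshaping a flat startup table into the nested
# {"name": ..., "children": [...]} structure a D3 circle-packing layout
# typically consumes. Column and file names are hypothetical.
import json
import pandas as pd

df = pd.read_csv("startups.csv")

root = {"name": "Startup Market", "children": []}
for sector, sector_df in df.groupby("sector"):
    sector_node = {"name": sector, "children": []}
    for sub_sector, sub_df in sector_df.groupby("sub_sector"):
        sector_node["children"].append({
            "name": sub_sector,
            "children": [
                {"name": row.startup, "value": float(row.funding)}
                for row in sub_df.itertuples()
            ],
        })
    root["children"].append(sector_node)

with open("startup_hierarchy.json", "w") as f:
    json.dump(root, f)
```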

Apart from these tasks, I helped with a few mini-tasks. For instance, I did some data validation for a PoC. I also worked on the Scales module of G1, the company’s micro-framework collection for frontend applications. I also attended my first ever design jam at the Facebook office, which gave me insights into how startups weave elements of design and privacy into the applications they create.

Q – Did you get a chance to participate in the Data Storytelling Hackathon at Gramener?

Yes. Another highlight of my internship was the Data Storytelling Hackathon.

Our team participated and designed a physical data visualization of character screen time for the Telugu movie “Jersey”. Although it was taxing to watch the movie about seven times to collect the screen-time data, it was a fun side project that sparked the inspiration for my screen-time web app. Below is the physical visualization:

Q – What learning are you taking back from your internship tenure?

During my internship, I gained an understanding of prominent tools used for web development. I also learned languages and environments such as JavaScript (libraries: D3, Vega), Node (libraries: Express, Puppeteer), and Python (libraries: pandas and Seaborn).

But the highlight of the internship at Gramener wasn’t just using the tools. It was the exposure I got from working in an organization. You can pick up any tool, but understanding the bigger picture of how an organization functions is something I gained through my experience at Gramener.

Gramener - A Straive Company

Gramener – A Straive company is a design-led data science firm. We build custom Data & AI solutions that help solve complex business problems with actionable insights and compelling data stories.

