Creating personalized meals with data: A Q&A with Daily Harvest Chief Algorithms Officer, Brad Klingenberg

It is becoming increasingly difficult to standardize taste.

The myriad culinary preferences and dietary demands of the American population are reflected in the $997B value of the U.S. packaged food market in 2020. Recent years have also seen a push to augment trips to the grocery store with at-home meal kits and food delivery services, a trend further accelerated by the onset of quarantine restrictions. With so many companies, cuisines, and dietary options, how does someone choose what’s best for them?

One company has a solution: Daily Harvest, a digitally native, direct-to-consumer company that provides personalized meals made from healthy, plant-based ingredients and shaped by data collected from the customer. This process lets each customer fine-tune their menu to their palate and diet. Personalizing meals for customers requires accurate algorithms, processing power, keen analysis, and a commitment to privacy.

Brad Klingenberg leveraged his experience with data-driven personalization at Stitch Fix to become the Chief Algorithms Officer at Daily Harvest. Brad sat down with Cindi Howson on The Data Chief to discuss how Daily Harvest uses algorithms and data to personalize meals for its customers, how it protects customer data, and how to avoid bias when creating ethical AI. Read on for more.

CH: Personalization in a digital world can be both convenient and invasive. How do you balance both aspects?

BK: I think it's important to have an alignment where the customer wants you to get to know them. At Daily Harvest, people are excited to share their experiences and what they love so that we can help them find more things they're going to like in the future; in effect, they co-create the food with us. Daily Harvest sources its ingredients directly from farms. From there, we can create anything that we want. Our job is to create food that people are going to love, so customer feedback is integral to helping us create their food. This creates a virtuous alignment, with the benefits shared between the customer and the company.

CH: When customers share their data with you, do you share it with anyone else?

BK: We're proud to be good stewards of customer data. Daily Harvest's business model does not involve selling that data to anyone else. The data's principal use is to improve the experience of individual customers; it also helps us innovate on our culinary portfolio. We use data to improve the service in ways that tangibly improve each customer's own experience.

There's an increasing awareness of privacy and of how data is shared across companies, often for purposes that customers might not intend. I think sharing with a company whose value proposition is built around getting to know you and helping you find things you love is quite different. That's why we see people so excited to share with us, both through the feedback built into the product and across all sorts of social media channels. People are excited for us to know them better.

CH: There's also the process of operationalizing the algorithms, which I would assume is key to the personalization. Is that right?

BK: Absolutely. It's sometimes useful to think of a hierarchy of ways you can use data. The first thing you need to do is have the data. Next, you need to be able to look at it and interrogate it in different ways to draw insights and find useful applications. Beyond that, being able to use it in an automated way plays an integral role in digital products and businesses.

I think a lot of data science is about not tricking yourself. It’s about making sure you're identifying causal relationships and thinking about the ways that structures and patterns in the data might lead to the wrong decisions. I think it fits nicely into a larger tradition of being critical about how we use data, and about correlations in the data that we don't want to perpetuate or that aren't useful for the task at hand.
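To make that point concrete, here is a minimal, hypothetical sketch in Python (not Daily Harvest's code; all numbers are invented) of how a pooled correlation can point in the opposite direction from the relationship within each customer segment, the classic Simpson's paradox that "not tricking yourself" guards against:

```python
# Hypothetical illustration of Simpson's paradox: the pooled correlation is
# positive, but the correlation within each customer segment is negative.
# This is an invented example, not Daily Harvest data or code.
import numpy as np

rng = np.random.default_rng(0)

def simulate_segment(n, base_orders, base_rating):
    # Within a segment, ordering more is (weakly) associated with LOWER ratings.
    orders = base_orders + rng.normal(0, 1, n)
    rating = base_rating - 0.3 * (orders - base_orders) + rng.normal(0, 0.5, n)
    return orders, rating

# Segment A: heavy orderers who rate highly; segment B: light orderers who rate lower.
orders_a, rating_a = simulate_segment(500, base_orders=10, base_rating=4.5)
orders_b, rating_b = simulate_segment(500, base_orders=3, base_rating=3.0)

orders = np.concatenate([orders_a, orders_b])
rating = np.concatenate([rating_a, rating_b])

print(f"Pooled correlation: {np.corrcoef(orders, rating)[0, 1]:+.2f}")      # positive
print(f"Within segment A:   {np.corrcoef(orders_a, rating_a)[0, 1]:+.2f}")  # negative
print(f"Within segment B:   {np.corrcoef(orders_b, rating_b)[0, 1]:+.2f}")  # negative
```

Read naively, the pooled number suggests that ordering more goes with higher ratings; controlling for the segment reverses the conclusion, which is exactly the kind of structure worth checking before acting on a correlation.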

CH: What do you think about AI ethics and biases in algorithms?

BK: An increasingly important topic. I think it is quite easy for bias to be introduced through the underlying data used to train an algorithm, or through the general process of development and evaluation. I think we'll start to see good practices and methodologies emerge for thinking about what we can do to combat bias. It's a hard problem, given that a lot of it comes from the data that people are using, but there's an emerging toolset to help evaluate and diagnose bias. When running an experiment, it's quite easy to look at aggregate results and miss disparate impacts across different subgroups. There are settings where that might be a small problem, but there are also settings where it's a big problem. I think having better methods to first measure and then minimize that impact is something that's developing and will be important in the future.

In other settings, I think it behooves companies in different industries to do a good job of recognizing how algorithms impact different people. I don’t think it's something that needs to be mandated; as data scientists become more aware of the potential downsides, they can do things to help fix them. There will be instances where it's in the best interest of the algorithm developer to minimize bias.

The key to making algorithms successful is that the devil is in the details: it’s making sure things are adapted to the particular structure of the problem you’re working on and the business you're working in.
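As one concrete illustration of the kind of check Klingenberg describes, here is a minimal, hypothetical sketch in Python (not Daily Harvest's methodology; the segments, metric, and numbers are invented) that compares an experiment's aggregate lift with per-segment lift, where an overall win can hide a loss for one subgroup:

```python
# Hypothetical A/B-test readout: the aggregate result looks positive, but
# breaking it out by segment reveals a negative impact for one subgroup.
# The data, segment names, and metric are invented for illustration only.
import pandas as pd

# One row per customer: experiment arm, a subgroup label, and whether the
# customer reordered within 30 days (1 = yes, 0 = no).
df = pd.DataFrame({
    "arm":       ["control"] * 4 + ["treatment"] * 4,
    "segment":   ["vegan", "vegan", "keto", "keto"] * 2,
    "reordered": [0, 0, 1, 1,   1, 1, 0, 1],
})

# Aggregate view: the treatment looks better overall.
print("Reorder rate by arm:")
print(df.groupby("arm")["reordered"].mean(), "\n")

# Per-segment view: the same comparison broken out by subgroup shows the
# treatment helping one segment while hurting the other.
by_segment = df.groupby(["segment", "arm"])["reordered"].mean().unstack("arm")
by_segment["lift"] = by_segment["treatment"] - by_segment["control"]
print("Reorder rate by segment and arm:")
print(by_segment)
```

The same idea scales to real data: compute the metric per subgroup alongside the aggregate, and flag any segment where the sign of the effect flips.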

Personalizing the customer journey with data and algorithms

ThoughtSpot is working with companies like Daily Harvest to leverage customer data and create a more personal, unique experience. Sign up for a free trial of ThoughtSpot to see how you can personalize and improve your customer experience.