

Season 3, Episode 59

Transforming Data for Educators with AI with Ben Dodson, Co-founder & CEO of Doowii

Featuring Ben Dodson, Co-founder & CEO of Doowii


In this week’s episode of Tech Can’t Save Us, hosts Paul David and Maya Dharampal-Hornby are joined by Ben Dodson, co-founder and CEO of Doowii. 

Doowii helps teachers harness AI to better understand and address their students’ needs, giving non-technical users access to data they previously couldn’t reach and improving workflows and decision-making.

Tune into this episode now to hear Ben discuss the problems with AI plagiarism detectors, how personalized data science improves student outcomes, and what he learned about work culture at Google and Snapchat.

Find more about Doowii here: https://www.doowii.io/

Episode Transcript

LH (02:13.865):

Hello, everyone. Welcome to a new episode of the Tech Can’t Save Us podcast brought to you by Literal Humans. As always, I am your co-host, Paul David.

 

Maya Dharampal-Hornby:

And I’m your co-host, Maya Dharampal-Hornby. Today, we’re joined by Ben Dodson, co-founder and CEO of Doowii, that’s D-O-O-W-I-I. We’re going to talk about transforming data for educators with AI, which is something near and dear to my heart. Ben, I don’t know if you know this, but I was a teacher and educator for about a decade in the States. Welcome to the pod, Ben!

 

Ben Dodson (02:41.486):

Awesome! What did you teach?

 

LH (02:43.913):

I taught English and history in eighth grade in Philadelphia, and I worked in several different school districts, including New York and a couple in California. I even taught in a prison at one point.

 

Ben Dodson (02:55.63):

Wow, that’s awesome. I didn’t know that. Thanks for having me on the podcast; I appreciate it.

 

LH (03:01.193):

We’re excited to have you! Just to introduce you, Ben, you run Doowii, a generative AI platform specifically built for educators. I would have loved to have it when I was in the classroom. Did you want to say something about Doowii?

 

Ben Dodson (03:12.986):

Sure! Doowii puts educational insights from attendance, grades, and feedback at its users’ fingertips. I co-founded Doowii after working in data science for an impressive roster of tech companies, like Google and Snapchat. I eventually wanted to pursue my own project. We launched Doowii last year, and last month we closed our latest fundraising round, securing $4.1 million. Congrats to us!

 

LH (03:50.286):

That’s amazing! Was that a seed round or a Series A?

 

Ben Dodson (03:55.598):

It was essentially a seed round. We had a first close on that round late last year and continued to roll on SAFEs until we officially called it around April or May of this year. So it was a larger seed than we initially anticipated, but in this environment, it was a solid round to give us more runway to execute at this stage.

 

LH (04:31.145):

That’s a really solid seed round, especially in this environment. Kudos for that! For our audience, could you explain what a SAFE is? Some folks might not know.

 

Ben Dodson (04:41.262):

Sure! SAFE stands for Simple Agreement for Future Equity. It allows investors to come in without having to price the company, which can be difficult at our stage. Instead, they can invest money at either a conversion discount or what’s called a valuation cap, which is what we did. This means that in a future priced round, they’ll receive shares at a favorable price compared to what we anticipate for that round. Essentially, it involves less paperwork than a priced round, enabling investors and founders to move quicker. SAFE was introduced by Y Combinator about 15 years ago and has become the de facto standard.
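To make the valuation cap mechanics concrete, here is a rough back-of-the-envelope sketch in Python. The investment amount, cap, and priced-round terms are invented for illustration (they are not Doowii’s terms), and real SAFEs involve discount, dilution, and pro-rata details this ignores.

```python
# Hypothetical illustration of how a SAFE with a valuation cap converts.
# All numbers are invented for the example, not Doowii's actual terms.

def safe_conversion(investment: float, valuation_cap: float,
                    round_valuation: float, price_per_share: float) -> dict:
    """Convert a SAFE into shares at the lower of the cap-implied price
    and the priced-round price (ignoring discounts and dilution mechanics)."""
    cap_price = price_per_share * (valuation_cap / round_valuation)
    conversion_price = min(cap_price, price_per_share)
    return {"conversion_price": conversion_price,
            "shares": investment / conversion_price}

# Example: a $500k SAFE at an $8M cap, with a later priced round at a
# $20M valuation and $1.00 per share.
print(safe_conversion(500_000, 8_000_000, 20_000_000, 1.00))
# -> conversion price of $0.40, i.e. 1,250,000 shares instead of 500,000
```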

 

LH (05:41.929):

And YC refers to Y Combinator, for those listening. Thanks for that investment education to kick us off, Ben! Our audience is now better informed about the investment ecosystem. While you’re still in education mode, could you explain what Doowii does?

 

Ben Dodson (06:02.862):

Absolutely! The genesis of Doowii came from recognizing that significant advancements have been made in data science over the past several decades. During my career in big tech, I saw these improvements in predictive analytics and data processing. However, at the end of the day, you still have to pay data scientists hundreds of thousands of dollars, regardless of whether they’re writing a thousand lines of code or just a hundred. The theory was that even though progress has been made in data science, it wasn’t translating well for educators. With generative AI, we aimed to create an AI data scientist that pulls together various components of data science. It can write 100 lines of code here and there to emulate the ability of data scientists to query, analyze, visualize data, and run predictive models. This way, we provide a personalized data scientist for every educator, allowing districts and universities to avoid investing millions in modern data architecture and teams, while still accessing those capabilities at a fraction of the cost.
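As a rough sketch of the pattern Ben describes (Doowii’s actual pipeline isn’t public, so the schema, data, and the stubbed-out model call below are all invented), the core loop is: take a natural-language question, have a model write a small query over existing school data, run it, and hand back the result in seconds.

```python
# Minimal sketch of an "AI data scientist" loop over toy attendance data.
import sqlite3

def generate_sql(question: str) -> str:
    """Stand-in for a large-language-model call that writes the query.
    A real system would prompt a model with the question plus the schema."""
    # Hard-coded answer for the single example question used below.
    return """
        SELECT grade_level, AVG(days_absent) AS avg_days_absent
        FROM attendance
        GROUP BY grade_level
        ORDER BY avg_days_absent DESC
    """

# Toy table standing in for a student information system export.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE attendance (student_id INT, grade_level INT, days_absent INT)")
conn.executemany("INSERT INTO attendance VALUES (?, ?, ?)",
                 [(1, 8, 12), (2, 8, 3), (3, 9, 7), (4, 9, 1)])

question = "Which grade level has the highest average absences?"
for row in conn.execute(generate_sql(question)):
    print(row)  # (8, 7.5) then (9, 4.0): eighth grade has the most absences
```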

 

LH (07:38.473):

Thank you. As we’ve covered, securing $4.1 million in seed funding demonstrates the market’s appetite for EdTech right now. Could you tell our listeners when generative AI first came into use in education and what the data being gathered can offer educators?

 

Ben Dodson (08:01.742):

Sure! Generative AI is relatively new to education. It probably gained wide recognition around the launch of ChatGPT. Prior to that, very little generative AI was being utilized or recognized as such. There were perhaps some features generating content on the back end for certain companies, but no one was adopting a generative AI strategy or even recognizing it as a strategy. Now, it’s a hot topic of discussion across districts and universities regarding its application.

 

The use of data really depends on the district’s attitude toward AI adoption and what they consider to be AI. Most districts don’t necessarily distinguish between AI and generative AI or its applications. A lot of the adoption is still being figured out, especially for tools like ours that require heavier integration processes. In contrast, tools like ChatGPT or other teacher tools allow teachers to sign up independently. These tools are seeing faster adoption because teachers can implement them immediately without waiting for district decisions. Many educators are willing to pay out of pocket to make their lives easier without relying on the district to provide these resources.

 

LH (10:16.425):

To set the scene, some of the drive for implementing generative AI in education is fueled by real progress. A report from EdTechX showed that classrooms using AI-driven interactive tools saw a 20% increase in student participation and a 15% increase in knowledge retention rates. Another study from McKinsey indicated that AI in education, including generative AI, reduced administrative workloads by about 30%, allowing educators to focus more on teaching and student interaction. I would have loved that when I was in the classroom; there was so much grading and administrative work that took me away from my students’ learning. It’s exciting to see these early gains, and you’re taking that further with Dewey. I appreciate the work you all are doing.

 

Ben Dodson (11:04.782):

Absolutely! Thank you. However, I would approach those statistics with skepticism. As a former teacher, you know that the challenges faced in Philadelphia or New York differ significantly from those in a thousand-student district in Iowa or Texas. Not every district in the U.S. is dealing with the same issues or operates in similar ways. The adoption of a particular tool can look quite different from one state to another.

 

I question how EdTechX is obtaining those statistics. For example, how are they monitoring something like student participation? What’s the sample size? Additionally, those studies likely measure short-term gains. What is the long-term impact? Are we seeing a novelty effect from introducing an AI tool into the classroom regarding participation? Does that novelty last over time? While AI shows a lot of potential, I think it’s wise to remain skeptical given the early stage of adoption.

 

LH (12:27.017):

Definitely. It would be good to follow up with EdTechX on that. You’re framing the current moment for generative AI in education very well. A Stanford study showed that 80% of educators acknowledge the benefits of AI, but 70% are concerned about potential biases in AI-generated content. You’ve also mentioned the flaws of AI content detectors gaining widespread use in schools. We’d love to hear your reservations on that specifically, as it’s a hot topic in AI right now.

 

Ben Dodson (13:02.894):

I wouldn’t trust any of the AI detectors regarding their accuracy. I don’t trust a lot of the statistics they provide. I’ve read their papers, and they often test several cohorts using different methodologies, selecting the testing cohort that yields the highest metrics while pretending that it applies broadly. For example, when testing essays written in English at the university level, you might achieve 99% accuracy. However, if you introduce a subpopulation of English as a second language learners, that accuracy drops considerably because their writing tends to be more atypical. This makes it harder to detect whether the text is AI-written or not.

 

The accuracy of AI detector tools varies by domain, whether by subject or student population, and it’s constantly changing. The models students use are often the latest releases from OpenAI or Anthropic. So, if an AI testing company claims 99% accuracy based on testing with GPT-3.5, that metric becomes almost irrelevant over time.

 

Ben Dodson (15:28.334):

Fundamentally, AI detectors are often transformer models or machine learning models that extract signals from a body of text. If a signal exists to differentiate between human and AI writing, that same signal can also be used by OpenAI or Anthropic to improve their models. Their goal isn’t necessarily to catch cheating; it’s to make AI-generated text sound more human and genuine.

 

LH (15:58.153):

And less detectable, right? Eventually, those models will catch up with the detectors, rendering the detectors somewhat moot.

 

Ben Dodson (16:04.43):

Exactly. If there’s a detectable signal, there’s also a signal to train and improve on. This creates a constant arms race. On one side, you have OpenAI with billions in funding, and on the other, you have specialized AI detectors with far fewer resources. It raises the question: who’s winning that arms race?

 

LH (16:22.409):

Yeah, it’s a detection issue.

 

Ben Dodson (16:36.014):

Moreover, there’s the moral issue of telling a student that our AI detector thinks they are cheating. What happens with false positives? What really matters is not just the overall accuracy but also the precision of these tools.

 

LH (16:46.505):

Right, right. No, go ahead, finish that point.

 

Ben Dodson (17:05.038):

You could detect half of the cheaters pretty accurately, but if that comes at the expense of 20% false positives, we need to consider what’s an acceptable rate. Should we accept falsely accusing someone 20% of the time just to take action on the 80% who are cheating? That seems fundamentally unfair. I haven’t seen this issue addressed in university policies or how people apply these AI detectors. So, generally, I think it’s a bad idea.
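A quick back-of-the-envelope calculation makes that trade-off concrete. The 50% detection rate and 20% false-positive rate are Ben’s hypothetical numbers; the base rate of students who actually used AI is an added assumption purely for illustration.

```python
# Illustrative precision calculation for an AI detector using Ben's
# hypothetical rates. The 10% base rate of AI-written essays is assumed.
students = 1000
ai_users = 100                       # assumed: 10% of essays are AI-written
honest = students - ai_users

true_positives = 0.50 * ai_users     # detector flags half of the real AI users
false_positives = 0.20 * honest      # ...and wrongly flags 20% of honest students

precision = true_positives / (true_positives + false_positives)
print(f"Share of flagged essays that are actually AI-written: {precision:.0%}")
# -> roughly 22%: most of the students flagged did nothing wrong
```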

 

LH (18:02.441):

Absolutely. I currently tutor a girl who submitted an English essay that was flagged as 98% likely to be AI-generated. She was devastated, and her mom rallied support against the school. Fortunately, they redid the essay, and it became clear she did not use ChatGPT or any generative AI. But for kids without supportive parents, this could seriously harm their motivation and ability to progress in their work.

 

Ben Dodson (18:58.126):

Yes, absolutely. To give you a sense of the unreliability of these systems, OpenAI had an AI detector that they removed because it wasn’t reliable enough. Another company, Hive.ai, also shut down their AI detector after a few months. They didn’t specify the reasons, but it seems there were similar concerns.

 

LH (19:34.217):

Do you see us moving toward a point where these AI tools become as accepted as Microsoft Word? Initially, there was resistance to using word processors instead of writing by hand, but over time, using tools like spell checkers became standard. Do you think we’ll see a similar acceptance of AI tools in the next few years?

 

Ben Dodson (20:08.142):

Absolutely. Many analogies can be drawn to technological shifts that change how we think about different mediums. For instance, the introduction of calculators in math or how photography transformed art movements like Impressionism and Cubism. I think something similar could happen with writing essays. Instead of teaching the standard five-paragraph essay to every student, we should explore what novel human elements AI can’t replicate and embrace AI as part of the medium.

 

While photography made art more realistic, it also allowed for new artistic expressions. Similarly, we need to identify the human element that complements AI tools. Those who advocate for completely banning AI are resisting an important change that will be part of our lives moving forward. Just as it doesn’t make sense to say we shouldn’t use computers at all, resisting AI tools is counterproductive.

 

LH (22:18.281):

Right. It’s about integrating these technological changes into existing practices. Given your skepticism about the EdTech X stats, could you outline the risks of overhyping generative AI’s current role and value in education?

 

Ben Dodson (22:48.334):

Are you asking if I think it’s overhyped?

 

LH (22:51.209):

Yes, and if so, what are the risks?

 

Ben Dodson (22:58.35):

I do think there’s some overhype because people are trying to predict AI’s capabilities across the board, not just in education. As we progress from GPT-3 to GPT-4 and beyond, some believe that future models will resemble AGI, which I dislike as a term because it’s ambiguous.

 

There are camps within AI that are enthusiastic about AGI and others that are skeptical. For example, Yann LeCun, who has led Facebook’s AI team, has stated that while Facebook’s public position is to work toward AGI, he personally disagrees with the term, viewing it as nebulous and not particularly useful for meaningful discussion. Those using the term often do so as a marketing strategy rather than because it’s useful for applied AI.

 

LH (24:44.969):

Interesting. There’s a lot of debate about when we’ll reach AGI and how we’ll know when we’re there. It’s food for thought. I want to get to the point about the widespread use of generative AI and how data-driven education has evolved. Inside Higher Ed reports that 50% of higher education institutions in the U.S. are using predictive analytics to improve student retention and graduation rates.

 

Ben Dodson (24:52.91):

Yeah.

 

Ben Dodson (24:59.694):

Mm-hmm.

 

LH (25:13.097):

McKinsey conducted a survey showing that educators are using AI-powered tools to enhance personalized learning experiences. HolonIQ found that 45% of educational institutions use AI-driven analytics for optimizing resource allocation and staffing budgets. This points to AI taking on some administrative functions. What key trends do you see shaping your company’s direction as it engages in the generative AI movement?

 

Ben Dodson (25:44.462):

I’ll discuss this from both the industry perspective and our approach at Doowii. The industry is looking at a broad range of problems to determine if they can be addressed with AI. Some issues have proven quick and effective for AI to solve. For example, many educators use ChatGPT for various tasks, from writing emails to generating lesson plans.

 

Some of these applications are starting to face pushback and scrutiny. If you instruct ChatGPT to create a lesson plan based on specific standards, it will do so. However, whether it actually meets those standards is questionable. This is part of the hallucination problem that generative AI faces. The severity of the problem depends on how badly the AI hallucinates and whether an administrator is monitoring the teacher’s adherence to standards. Sometimes it might be close enough, but if it blurs the boundaries of what should be legally accurate, that’s a significant issue.

 

Ben Dodson (28:08.494):

There’s increasing scrutiny on how AI tools are applied, despite the enthusiasm for their potential benefits. From our perspective, we don’t see our platform as primarily a generative AI platform; rather, we view it as a data platform that utilizes generative AI to enhance accessibility and interoperability. We aim to solve data problems rather than focusing solely on generative AI.

 

Our platform combines features of systems like Tableau and Databricks, integrating generative AI as a core component of the architecture. We are designing an AI operating system on top of a modern data stack. Many districts and universities face longstanding data issues that require traditional data engineering solutions. This includes managing large amounts of structured and unstructured data, ensuring data interoperability, and producing customized reports.

 

Generative AI is one part of the solution, helping to create custom reports more quickly and easily. However, there are still significant backend plumbing issues that need to be resolved for everything to function smoothly. We’re not just applying generative AI to data; we’re also integrating various technology components into the education space.

 

LH (30:25.993):

I appreciate that answer. There’s been a lot of discussion about generative AI for its own sake because it’s the latest trend. What you’re doing is using generative AI as a tool to improve educational outcomes in K-12 and higher education. Your platform helps with better processing of student information, managing classroom rosters, resource allocation, and enhancing student learning outcomes. These are critical aspects of education, and generative AI simply facilitates better performance. I think your positioning in the market is smart and will hopefully attract many customers. Kudos for that.

 

Ben Dodson (31:14.766):

Thanks, yeah, I hope so as well.

 

LH (31:16.809):

In terms of providing the architecture for educators to use AI responsibly, as Paul mentioned, in an outcome-driven way, I’d love to hear feedback you’ve received from educators using your platform.

 

Ben Dodson (31:34.958):

I think the feedback falls broadly into two main domains. First, users often say that tasks that used to take a week now take only a few seconds to a couple of minutes. Previously, to compile a data report, they had to go through multiple systems, download CSVs, and work in spreadsheets, which would take three to four hours over the course of a week. Now they can just type a natural language command into the platform and get exactly what they want back, or about 90% of what they need, within seconds.

 

The second type of feedback is from users who say they had no idea how to access certain data. They might have needed this information for a specific purpose. Without our platform, they wouldn’t have been able to figure out how to obtain that data. These two use cases address different problems: one is regular workflows or standard reporting that administrators or faculty need weekly, while the other involves unique, time-sensitive issues requiring quick data access. Our platform provides straightforward solutions for both scenarios.

 

The way we’re building our platform emphasizes acting as an AI data scientist. It’s not just about generating reports; it’s also about engaging with users’ questions regarding the report. We aim to explain the metrics and break them down in various ways. We tailor our explanations to the user’s level of technical understanding, creating a more dynamic process.

 

My experience as a data scientist showed me that when you give someone a report, they often have follow-up questions. They might ask about an odd spike on a chart or seek clarification on the data presented. Having a platform that can address those follow-up questions is powerful.

 

LH (34:27.145):

Absolutely. To further illustrate that second use case, could you provide an example of a unique problem that might arise and how your platform helps solve it?

 

Ben Dodson (34:39.31):

Sure! For example, one of our early customers experienced teacher strikes. We were able to analyze the behaviors of the teachers both before and after the strike, monitoring certain actions they should and shouldn’t have taken. For instance, we could see if teachers deleted courses, which would prevent students from accessing them.

 

We also looked at the impact on students, providing a granular timeline view of specific metrics. This allowed the school to clearly identify the effects of the strike and address behaviors more quickly. Normally, these types of reports wouldn’t be needed or top of mind on a weekly basis, but in this case, there were specific metrics that warranted immediate review.

 

LH (36:07.433):

Interesting. I have a fun question for you, Ben. I’m attending a talk later tonight by Sal Khan, the founder of Khan Academy. If you could ask him one question about his role, especially given that the talk is titled “How AI is Revolutionizing Education,” what would it be?

 

Ben Dodson (36:14.286):

That sounds cool.

 

Ben Dodson (36:19.246):

Haha.

 

Ben Dodson (36:26.414):

I know.

 

That’s a…

 

Ben Dodson (36:34.766):

Yeah, definitely. That’s a good one. One of the most interesting developments in Generative AI is the push for more personalized learning. I would say Sal Khan and Khan Academy have been at the forefront of this with Khanmigo. If you saw their recent demo with GPT-4, their ability to take in multi-modal inputs as a personalized AI tutor looked impressive. If I had Sal Khan in a one-on-one conversation, I might ask how much of that demo is real, as I have some skepticism about whether GPT-4 performs as smoothly as they showcased.

 

A more confrontational question might be about how they’re improving the guardrails for Khanmigo. I’ve seen issues where users attempted to get Khanmigo to say certain things, like asking it to explain how to start a meth lab, which raises concerns about safety and appropriateness.

 

LH (38:16.585):

Yeah, I think that’s a valid point. Can you explain what Khanmigo is for our audience?

 

Ben Dodson (38:24.814):

Sure. Khanmigo is the AI tutor persona that Khan Academy has released. Many school districts are piloting it as a personalized one-on-one tutoring solution, which has been a longstanding goal in education. The challenge has always been how to scale one-on-one tutoring cost-effectively, and Khanmigo aims to address that. However, Sal Khan has raised questions about implementing the right level of guardrails, especially since Khanmigo operates on top of GPT-4. While GPT-4 has its own guardrails, they may not be sufficient for young children.

 

There are ways to bypass these guardrails, and I’ve seen instances where users successfully prompted Khanmigo for inappropriate content. This relates to the larger issue of AI’s hallucination problem, especially when trying to align lesson plans with educational standards. I’m curious about their progress in addressing these issues. Sal Khan may believe they have sufficiently solved these problems or have a plan for future improvements.

 

LH (40:28.809):

It strikes me that a friend and I are working on a tool for positive masculinity, and we’ve found that building a competitive moat around tech products can be challenging. The openness of platforms allows for similar products to emerge easily, raising questions about how unique Khanmigo really is.

 

Ben Dodson (41:08.638):

Absolutely. Many platforms exist because users may be unaware or reluctant to use ChatGPT instead, even though it offers similar functionalities.

 

LH (41:20.681):

Exactly. It’s an interesting segue into discussing growth. You’re in a highly competitive space, with around 15,000 school districts and thousands of higher education institutions to reach. I’d love to hear about your growth strategy and any marketing or partnership initiatives you’re excited about. With new funding, how do you plan to scale Doowii?

 

Ben Dodson (41:53.006):

We’re focusing on three main channels: K-12, higher education, and partnerships. We’re seeing the most traction through partnerships. We work with larger ed-tech platforms that generate significant data and can benefit from offering our analytics tool as a white-labeled or gray-labeled solution.

 

Analytics is increasingly becoming a critical component that customers expect as part of their platforms, even if it’s not a core offering. The interactions with analytics are likely to become standard within the next few years. There’s a lot of complex data engineering involved, which goes beyond simply applying Generative AI as a wrapper around data.

 

We’ve seen great demand for our expertise in this area, allowing us to enter the market more rapidly. Selling directly to districts and universities is challenging, but partnerships leverage existing relationships, enabling us to piggyback on their established networks.

 

LH (43:35.177):

Do you have an example of an early success story from a partnership you’ve scaled?

 

Ben Dodson (43:41.326):

I do have examples, but I can’t talk about them right now because we’re planning announcements over the next few months for a bigger marketing push. You can ask me again in six months, and hopefully, I’ll have more to share.

 

LH (43:44.169):

Sure, sounds good.

 

LH (44:01.673):

I’ll have to have you back on in six months or a year to discuss your successes and next fundraising efforts. It will be exciting to hear about that. I understand how challenging it is to sell directly to K-12 districts and to work with various ed-tech companies, charities, and nonprofits in that space. I’m happy to share some insights we’ve learned over the past couple of years to help with your strategy.

 

Ben Dodson (44:28.078):

Absolutely, I appreciate that.

 

LH (44:29.801):

Regarding demand from third-party partners, how do you vet the companies you collaborate with?

 

Ben Dodson (44:44.494):

At our stage, we require a certain scale from potential partners. It’s essential that the platform has a significant user base and is in need of analytics capabilities. We look for platforms that might have a basic or missing analytics component, especially if their users are demanding more insights.

 

Typically, the ed-tech platform should have analytics on its roadmap, even if they haven’t invested heavily in it yet. This way, we can integrate with their existing data collection methods more easily. If they already have a custom dashboard, merging our solutions would be more complex compared to providing a straightforward, out-of-the-box analytics tool that they can offer to their users right away.

 

Another consideration is language. We’ve had international inquiries, but at this stage, we can only operate in English. It’s not just about translating user queries; we also need to translate the data and validate that process, which adds a significant layer of complexity for non-English-speaking users.

 

LH (47:00.329):

I appreciate your point about having a solid data and analytics strategy. It’s essential for businesses to implement their learnings effectively. I recently attended a dinner focused on mergers and acquisitions, which was quite British, held in a crypt at Fort Newman Masons. One discussion highlighted that companies need a coherent data strategy to compete in the near future. Those that can rapidly learn from their data and iterate on their products will have a significant advantage.

 

This is especially critical in education, where rapidly adapting to prepare students for future jobs is essential. Generative AI adds another layer of complexity to that challenge. Your point about needing a solid data strategy resonates strongly.

 

Ben Dodson (48:12.334):

Absolutely. Unfortunately, the education sector is multiple generations behind enterprises when it comes to data utilization. That was part of the hypothesis behind starting Doowii. We’re seeing many companies apply generative AI to data, but very few focus specifically on education data. Most, like Tableau, Looker, and Power BI, are introducing features for enterprise data interactions, but education has unique needs.

 

It’s challenging to apply general enterprise solutions to specific education contexts without addressing the last-mile hurdles. Generative AI doesn’t inherently understand the nuances of education data. Each educational institution has custom metrics, but they tend to be more similar than different, allowing us to build an education-specific knowledge graph.

 

For instance, while at Snapchat, I encountered thousands of custom metrics across many tables. Simply applying GPT models to that data would yield low accuracy because the models wouldn’t comprehend the specific metrics tracked by Snapchat. Scaling that approach across different enterprises is incredibly difficult. A dedicated data team is essential to create the semantic layer needed to connect natural language models with BI tools.

 

Our focus on education enables us to develop a tailored knowledge graph. While each university might have some unique metrics, their structures and needs share enough commonality that we can streamline our approach.

 

Ben Dodson (50:36.75):

We’re focusing exclusively on education data and education platforms. By doing this, we only need to understand the data models of a few dozen major ed-tech platforms. From there, we can create a semantic understanding for about 80% of the data used in education.
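As a loose sketch of what such a semantic layer might look like (the metric names and SQL below are illustrative, not Doowii’s actual knowledge graph), the idea is a shared dictionary that pins down what each education metric means before any model-generated query touches the data:

```python
# Illustrative education-specific semantic layer: canonical metric definitions
# that a language model resolves against instead of guessing at column meanings.
SEMANTIC_LAYER = {
    "chronic absenteeism rate": {
        "description": "Share of students missing 10% or more of enrolled days",
        "sql": "SELECT AVG(CASE WHEN days_absent >= 0.10 * days_enrolled "
               "THEN 1.0 ELSE 0 END) FROM attendance",
    },
    "course pass rate": {
        "description": "Share of course enrollments with a passing final grade",
        "sql": "SELECT AVG(CASE WHEN final_grade >= 60 THEN 1.0 ELSE 0 END) FROM grades",
    },
}

def resolve_metric(user_phrase: str) -> str:
    """Look up the canonical definition before any query generation happens."""
    key = user_phrase.strip().lower()
    if key not in SEMANTIC_LAYER:
        raise KeyError(f"Unknown metric {user_phrase!r}; ask the user to clarify")
    return SEMANTIC_LAYER[key]["sql"]

print(resolve_metric("Chronic absenteeism rate"))
```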

 

LH (50:57.481):

How do you reconcile that with your earlier point about education needing to look different in places like New York, Texas, and Alaska? Education occurs within specific physical contexts for most students. How do you integrate those two perspectives?

 

Ben Dodson (51:06.317):

That’s a great question. The districts in New York and Kansas have different needs, but they often use the same systems, like PowerSchool, which is a leading student information system.

 

LH (51:24.937):

Yes, that’s a major player.

 

Ben Dodson (51:38.126):

Right, the major data sources for a district include student information systems, learning management systems, and assessment systems. While both districts might rely on the same underlying systems, their needs differ. We address these diverse needs by ensuring that our generative AI platform is flexible enough to accommodate different inputs, while still using the same underlying data models.

 

There may be some slight variations, but generally, organizations aren’t creating vastly different versions of PowerSchool or other platforms.

 

LH (52:24.553):

Interesting. I think we have time for one more question before we wrap up. Care to ask it, Maya?

 

LH (52:28.91):

Given your experience at Snapchat and Google, how has your personal journey differed now that you’re leading your own smaller tech company?

 

Ben Dodson (52:52.782):

Absolutely. The differences are significant, particularly regarding time commitment and the level of effort required. The stability in larger companies like Snapchat and Google means you know what to expect over the next month or even six months.

 

At Doowii, while we just raised funding, there’s a constant underlying anxiety about whether we will succeed and how we’ll reach our next milestone. Living with that uncertainty has been a major shift for me.

 

Another difference is the opportunity to build company culture myself rather than merely participating in an existing one. I get to shape the culture I want for Doowii and set the vision for it.

 

LH (54:23.369):

What kind of culture do you want to build?

 

Ben Dodson (54:30.894):

Intrinsic motivation is crucial, especially for a startup. As a quick aside, my first startup idea was 12 years ago—a hot tub movie theater concept. The idea was to have hot tubs and a giant screen for movies. However, after six months, I realized I didn’t have a passion for that concept, so I abandoned it.

 

This experience taught me the importance of having intrinsic motivation because it helps you persevere during the inevitable tough moments.

 

To foster that intrinsic motivation within the culture, I reflect on my time at previous startups. At one point, it felt like I was working with family because you see each other every day. If you weren’t working together, you’d be hanging out, forming strong friendships. However, I realized that this comparison isn’t entirely accurate.

 

I have two sisters, and regardless of how they perform, they remain my family. In a startup, if someone isn’t performing well, there’s no longer a place for them. So, it’s more like building a sports team: you want to get along and enjoy working with your teammates, but the ultimate goal is to win the championship.

 

If the team isn’t aligned towards that goal, they won’t succeed. There’s no tolerance for anything that detracts from that shared objective.

 

Ben Dodson (56:48.686):

It’s not personal; this is what we all signed up for. That’s the culture I aim to create—everyone is committed to the ultimate goal. We want to support each other and enjoy our work together, but we can’t lose sight of why we’re here.

 

LH (57:13.929):

I appreciate that perspective. Some people misuse the idea of “we’re a family” in companies. You don’t invoice your family or fire them for underperforming, right?

 

Ben Dodson (57:22.67):

Yeah, exactly.

 

LH (57:25.321):

I find the sports team analogy very motivating. We’ve embraced it at Literal Humans as well. You cheer each other on, support one another, and everyone has a different position to play on the field. You want to build authentic, deep relationships within that context, but there are limits. Different people have varying experiences of family, so using that motif can be triggering for those with negative family experiences. Of course, some may also have had bad experiences on teams, but the team concept is a bit looser and more flexible.

 

I appreciate everything you’ve shared about your journey in building Doowii. It’s been a fantastic conversation with Ben Dodson, the co-founder and CEO of Doowii, discussing the transformation of data for educators using AI. Ben, where can folks find out more about Doowii online?

 

Ben Dodson (58:14.062):

You can visit our website at doowii.io, that’s D-O-O-W-I-I dot I-O. All the information is there.

 

LH (58:23.241):

We’ll be sure to include a link in the podcast episode description. Thanks, everyone, for tuning in. Please leave a review if you enjoyed this episode. Help us share, subscribe, and drop us a five-star rating so we can continue bringing on great guests like Ben Dodson. Thank you for listening!

 

Ben Dodson (58:40.302):

Thanks so much for having me. It was really fun. Thanks, everyone.