
Sidekick Education

Our AI collects data on how students solve problems during company-sponsored projects, then uses it to connect learners to ideal careers.

Table of contents

  1. History
  2. Learn More
  3. The Problem
  4. Meet the Cast: Users
  5. The Solution
  6. FAQs


History

When co-founder and CEO Chris was 18, he made a promise to change the perception of what a student could do. Sidekick makes good on this promise by allowing young adults to work side-by-side with industry experts to complete real-world projects while ensuring they achieve greater academic outcomes than with traditional instruction.

Learn More

To avoid redundancy with the new questions, we've shortened this aspect of the online submission. If you're interested in learning more about the problem, solution, or business model, check out the new questions or review the attached Executive Summary!

The Problem

Two decades of research show that authentic-audience, project-based learning is potentially the most comprehensively effective learning method, with 3% to 300% improvements in academic and high-stakes testing outcomes. Studies have shown that project-based learning can also be an effective lever for closing the achievement gap across sex, race, and socioeconomic status. These benefits aren't a secret: 80% of professors report wanting to bring learning through relevant real-world projects to their classrooms. Yet only 1% do.

This is because it takes 3 to 5 times more effort to find a real-world partner with a suitable problem to solve, align the project to desired learning outcomes, and then adapt the supporting instruction and materials to the inevitable twists and turns that come with a real-world project.

Meet the Cast: Users

Early in the Refinement Phase, we recognized that a critical assumption had long gone unquestioned: student adoption and engagement rates. Student input is crucial to the long-term sustainability of Sidekick, because the inputs students provide are what feed our AI.

We conducted 15 student user interviews at multiple universities in New York and Illinois. From these exploratory interviews, we learned how little we knew about these users! But we also recognized a couple of promising themes:

  • "Stay on top of my studies": far more top-of-mind than careers was simply doing well in class. This, paired with confirmation that students struggle to work in groups, presents an opportunity to fill an academic need while serving our corporate recruiters' professional one: a companion app for students that proactively drops "breadcrumbs" on how to solve problems in exchange for activity data.
  • "Very, very undecided": even late into their undergraduate careers, students often did not know what they would do after school. They use internships to explore what they like, but internships are substantial commitments and students can only do so many of them; meanwhile, most new hires know within the first week whether they will stay at a company. Students also found that career advisement centers sometimes underserve them, leaving an opportunity to bring real-world skills and career exploration directly into the classroom.

We also reviewed dozens of prior interviews across all four key user types. The following personas were developed from these data.

(It should be noted that our business model ultimately hinges on payments from corporate recruiters, but given our recent focus on Noah, for this competition we wanted to think about how our product helps the students.)

[Four persona images]

The Solution



FAQs

How do you assess the individuals within a team? Would you be able to identify individual performance the way Moneyball GPA does?

We've integrated a few ways to assess and ensure learning and progress for individuals (although it's worth noting that the research suggests this is not as large a concern as it intuitively seems). First, however, it's important to understand the underlying mechanisms.

  1. We have several incremental checkpoints built into projects where students complete authentic assessments in the form of actually-useful work products. These provide us snapshots of the student team's activities and progress along the project, but they also slot in nicely for traditional evaluation from professors, TAs, and peers.
  2. Unlike most innovation competitions, where concept development resembles a traditional funnel, we expect sponsoring project partners to check in on a regular cadence. We don't expect partners to weed through every single concept every time, so we build in a peer evaluation first: teams pitch their hypotheses or solutions, and other teams can "Adopt" a pitch. This is similar to OpenIDEO's "Inspired by" or "Create Team" options, except an adopting team is effectively giving up its own idea for someone else's. Adopters have to have real conviction in the adopted idea, so they take a more rigorously critical eye than they would to give a Like or an Upvote, and even more than when crediting someone with contributions. The count of pitches adopted then becomes an important proxy for the quality of a student team's pitches.
  3. Project partners stay on the platform after project completion to report back on a) whether the final deliverable was implemented and then eventually b) whether it was successful (the Sidekick team uses reengagement campaigns when necessary). These milestones are then linked back to the project and the student teams that contributed to the final deliverable.
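The "Adopt" mechanic in step 2 reduces to a simple counting model. The sketch below is purely illustrative (all class and method names are hypothetical, not from the Sidekick codebase); it shows why adoption is a costlier, and therefore more credible, signal than a Like:

```python
from collections import defaultdict

class PitchBoard:
    """Hypothetical sketch of the peer-evaluation "Adopt" mechanic.

    Each team authors one pitch per checkpoint. A team may abandon its
    own pitch to adopt another team's; the number of adoptions a pitch
    attracts becomes a proxy for its quality.
    """

    def __init__(self):
        self.pitches = {}                  # team -> pitch it authored
        self.adoptions = defaultdict(set)  # author team -> adopting teams

    def pitch(self, team, concept):
        self.pitches[team] = concept

    def adopt(self, adopting_team, author_team):
        # Adopting is costly: the adopter gives up its own pitch,
        # which is what makes this signal more rigorous than a Like.
        if adopting_team == author_team:
            raise ValueError("a team cannot adopt its own pitch")
        self.pitches.pop(adopting_team, None)
        self.adoptions[author_team].add(adopting_team)

    def quality_proxy(self, team):
        """Count of distinct teams that adopted this team's pitch."""
        return len(self.adoptions[team])
```

Because an adoption deletes the adopter's own pitch, the count cannot be inflated by cheap engagement: every unit of "quality" costs another team its idea.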


Now, to answer the direct question:

  1. Many professors and TAs are not new to group work, and even those who are remain trained educators and experts in their disciplines. Our work products give them the freedom to assess work against their personal or institutional standards. And because these projects are automatically managed and much of the learning is self-directed, educators are freed up to observe, intervene, and evaluate students individually.
  2. Sidekick gets more powerful with repeated use. Although educators are free to override Sidekick's recommendations, our team matching algorithm balances diversifying team capabilities (shown to improve learning for the lowest performers on a team), clustering students around what they need to learn, and assigning teammates in a "round-robin" fashion. This allows individuals to differentiate around adoption, implementation, and success counts (the potential covariances between team members, and the insights they will reveal, are also exciting). It is somewhat similar to the Sabermetrics approach proposed by Moneyball GPA.
  3. Finally, don't count out existing assessments like tests! While the future may find a better tool for assessment, our goal is to create something that can be used today while riding that wave into the future. Research shows that real-world project-based learning, when done well, results in better performance on high-stakes testing.
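The capability-diversifying round-robin in point 2 can be sketched roughly as follows. This is an illustrative assignment scheme under our own assumptions (a single scalar skill score per student, snake-order draft), not Sidekick's actual matching algorithm, and all names are hypothetical:

```python
def round_robin_teams(students, team_size):
    """Illustrative team matching: rank students by a skill score, then
    deal them out to teams in "snake" order so strong and weak
    performers are spread evenly (diversifying team capabilities).

    `students` is a list of (name, skill_score) tuples.
    """
    ranked = sorted(students, key=lambda s: s[1], reverse=True)
    n_teams = max(1, len(ranked) // team_size)
    teams = [[] for _ in range(n_teams)]
    for i, (name, _) in enumerate(ranked):
        # Snake order (0,1,...,n-1,n-1,...,1,0) keeps the average
        # skill score of each team roughly balanced.
        pass_no, pos = divmod(i, n_teams)
        idx = pos if pass_no % 2 == 0 else n_teams - 1 - pos
        teams[idx].append(name)
    return teams
```

A production matcher would add the other two objectives named above (clustering by learning needs, rotating past teammates), but even this toy version shows how a draft order alone yields capability-balanced teams.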


Would you be able to give me an example of a project?

An early project partner is an international nonprofit that runs a network of schools in emerging markets for children marginalized by disabilities. They've seen great success and are looking to expand their operations. The problem, however, is local public opinion. These disabilities are still taboo in some areas, and community resistance and restrictive local policies are mounting. Student teams were tasked with delivering an overview of the cultural and historical context the organization was operating in, plus a recommendation on the key question: what is their best next step? Should they raise money in the States (where they're headquartered) to combat the resistance, or should they invest more time in building bridges with local leaders? Naturally they should do both, but where should their priorities be?

Such a project obviously fits into several disciplines, and student teams in independent studies or cross-disciplinary courses could tackle the project naturally while Sidekick ensures they learn. Or, educators can "filter" on the desired topic, adding constraints on how students can approach the problem (this limits authenticity in our view, but it's an understandable decision that we leave up to the educator). Because we source teams across several classes or even institutions, we diversify the disciplines on the project to ensure the partner is still receiving quality options.

Who is your idea designed for and how does it reimagine higher education to support the needs of tomorrow?

Our paying users are Heads of HR and Talent Directors. The *most important* users are ethnic minority and first-generation undergraduates. Both college persistence and STEM career representation for these students are abysmal. Our solution helps higher education systems address both major issues because we a) provide real-world project-based learning proven to have outsized academic gains for these students, b) give students chances to develop skills and networks, and c) reduce hiring biases.

This idea emerged from:

  • A group brainstorm
  • A student brainstorm

What skills, input or guidance from the OpenIDEO community would be most helpful in building out or refining your idea?

We have a working proof of concept for the simplest version of our software, but as our algorithms get more sophisticated we could use additional data science expertise and suggestions. Additionally, a big risk for us is ensuring students and professors adopt the authentic assessments we provide, which is how we collect data. Strong product and UX skills as well as all additional insights into our users' behaviors, thought processes, and needs would be a big help.

What early, lightweight experiment might you try out in your own community to find out if the idea will meet your expectations?

We're in the process of implementing our first major pilots with educational institutions in March. In the meantime, we can use an MVP (sales collateral) to test "Mafia Offers" with potential sponsors (companies). This is as simple as starting with our immediate networks' employers.

Additionally, we need to prove that students will use our value-added authentic assessments throughout the project, which is critical to data collection. This we can test with concierge pilots.

Tell us about your work experience:

Chris leads product and technology and comes from startups and finance, most recently as an analytics product manager before pursuing his MBA. Jeremy heads up curriculum development and strategy and comes from education as a TFA alum who spent 6 years in a college prep classroom. Brian handles operations and business development and started his own education foundation back in Hong Kong. Safia manages relationships with corporate partners and higher ed accounts and hails from tech and education.

How would you describe this idea while in an elevator with someone? 2-3 sentences.

For recruiters hiring recent grads who need to beat other employers to rare talent while reducing the frequency and cost of "picking wrong," Sidekick is a recruiting platform that turns real problems into syllabus-aligned class projects so they can get authentic answers to how well candidates solve their problems. Competitors provide project management tools. Sidekick ties projects into academics with proactive AI companions that make projects easier and more enjoyable for everyone involved.

What is the specific problem your idea is trying to solve? 1 sentence.

Even though real-world experiential learning could solve both the excessive costs of recruiting recent grads and the disengagement faced by current undergraduates, the preparation and management of these real-world projects have kept most of higher education from using it.

How is your idea different or unique from what is currently on the market?

Early on, our key differentiator will be the companion app for students. Ultimately, though, our proprietary technology gives us unfair advantages. We're the only ones that can integrate corporate recruiting directly into classroom concepts, achieving similar prices to our competitors for a superior product. Because it's a data product, we benefit from a built-in “flywheel” that makes it hard to compete with us: the more projects we get, the better our product works at the same or lower price.

How do you plan to measure the impact of your idea?

Crucial to Sidekick's early success is student and professor engagement and satisfaction, so our early KPIs will focus on positive growth in these metrics between sequential cohorts. After early validation that this is indeed something our users will adopt, we can measure impact on hiring patterns. The obvious metric is the up-and-to-the-right count of recruiters, students, and hires from our platform. We can also benchmark the diversity of our hired candidates against conventional recruiting.

How might your idea be transferable to a large number of people?

Initial positioning and targeting will focus on business, engineering, and computer science schools, given their familiarity with experiential learning. However, we envision our machine learning algorithm learning an academic ontology that spans across all disciplines, so any professor could use career-connected experiential learning in her class. It'll be commonplace for History majors to stand toe-to-toe with Business majors for that consulting job based on their project performance profiles.

What are your immediate next steps after the challenge?

We'll continue to serve our pilot partners and focus on product development. Something we learned in this Challenge was the importance of the student, and so we have a renewed focus on upgrading the student product.

Comments

Stacey:
Hi Chris,
Really interesting idea! I'm on the faculty at Seattle Univ and you highlight an important challenge for me in my courses. Would love to learn more as this project evolves. How do I stay connected?
Best,
Stacey

Chris:
Hi Stacey! Thanks for your interest and we'd love to stay connected. I've been in the air more than the ground the last few days and am heading out of town with limited internet access again for the next two weeks. Mind if I circle back with a more robust answer once I have my bearings again? (I found you on the web so I can email you at Seattle U!)
