Case Study

How Codecov Helped Udacity Empower Their Developers To Write Good Tests

Challenge

  • Missing a way to unify code coverage reports across teams.
  • Lacking a way to view code coverage holistically over a period of time.
  • Needing a way to encourage developers to continue writing tests.

Goal/Use Case

  • Go from word-of-mouth code coverage practices to a globally defined process and tool for code coverage.
  • A way to drill down into the metrics from coverage reports.
  • A UI that encourages and empowers developers to write good tests in an engaging way.

Solution

  • The ability to maintain coverage consistency between languages with normalized outputs.
  • The ability to monitor code coverage changes over time, at any time. 
  • A tool that fits into the existing team workflow.
"Originally I was looking for rates like 'as code coverage goes up error rates go down', but what we really started to see is, as code coverage goes up, confidence goes up and compliance challenges start to get checked off. That’s been as valuable as error rate metrics for us."

Aaron Stone, Director of Engineering

Udacity is an online learning platform offering groundbreaking credential programs in fields such as artificial intelligence, machine learning, self-driving cars, and robotics, as well as app and web development, digital marketing, and more.

Their mission is to train the world’s workforce in the careers of the future. They partner with leading technology companies to learn how technology is transforming industries and to teach the critical tech skills that companies are looking for in their workforce. With a goal of training 1M people globally in technical skills by 2025, they offer free and paid courses, including certificate programs called Nanodegrees.

As Udacity adopts Codecov globally, they will be pairing engineers across teams to ensure cross-training and developer adoption of Codecov company-wide.

 

Q&A

Question: How do you measure success when setting goals for each contribution?

Answer: Uptime and NPS scores. As an engineering team, we ask ourselves: did we develop a quality product? Did users engage with it and find that it worked for them? Making sure we have APM and error tracking enabled, and good fallback pages in case an API fails, are all little details we track.

Question: How do you think about ensuring quality earlier in the development process (before you reach the production stage)?
Answer: You know, with every product you start there’s some initial phase of whiteboard drawings, hand sketches, and writing code to see what starts to stick. We try to make that as freeform as possible while still using our platform toolkit. The idea is that when you start to get out of that phase, maybe your first 10 or 15 commits on your repository are all just slapping stuff straight into main, deploying into our sandbox environment, and seeing what it looks like. Get a feel for what you’re building. But then even that part is going to be supported by certain boilerplate frameworks and best practices.

Question: Before using Codecov specifically, how were you going about setting code coverage goals or targets for your team?
Answer: Yeah, so lots of programming languages have common tools that will give you that information, and some of the more savvy developers would just use those at their desktops. In some cases, we were using other hosted services. But what we found challenging was getting consistency between languages, getting those outputs normalized, and starting to track them over time, so that we could produce lists of who needs to improve their coverage and monitor whether someone’s trending up or down. Which turns out to be valuable. We looked at the set of services that we were using, we wanted to find one to consolidate that, and Codecov really met our needs.
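For readers who want a concrete picture of how reports from different languages end up in one place, here is a minimal, illustrative CI sketch using the public codecov/codecov-action. The job name, paths, and the Python test step are assumptions for the example, not Udacity's actual pipeline; any language's native coverage tool can produce the report that gets uploaded.

```yaml
# Illustrative GitHub Actions workflow; names and the Python tooling are assumptions.
name: tests
on: [pull_request]

jobs:
  coverage:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      # Any language's coverage tool works here; pytest-cov is just one example.
      - name: Run tests and produce a coverage report
        run: |
          pip install pytest pytest-cov
          pytest --cov --cov-report=xml
      # Codecov ingests the report and normalizes it alongside reports
      # from other languages and repositories.
      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v4
        with:
          token: ${{ secrets.CODECOV_TOKEN }}
          files: coverage.xml
```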

Question: How did you hear about Codecov initially?
Answer: I first saw it when I did a pull request on one of the open-source projects that I contribute to and started seeing this new thing with the cool scatter/box chart of coverage elements, and I thought, “Hey, that’s neat. Let me make sure to put that on my vendor comparison list.” And then I ran into you guys at a GitHub event and that cemented it. I knew then that this vendor means business and wants to get the word out there. So I said, “Hey, that’ll be our winner.”

Question: How would you describe Codecov in a sentence?
Answer: Codecov lets you know whether the unit tests you’ve written actually cover all the if and else cases that you wrote.

Question: Are there any specific features you’d want? Like, “If Codecov did X, it would move the needle for us”?
Answer: That’s a good question. I would reframe it a little bit to say that one of our favorite features comes at the moment when a project arrives at its maturity point and we tick the button that says, “If applying this pull request reduces coverage for the application, we’re going to block the PR.” That’s when you know the team has got rhythm and rigor around testing.
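As a rough sketch of that “block the PR” moment, a status check along these lines can be declared in a repository’s codecov.yml. The thresholds shown are placeholder assumptions, not Udacity’s settings; actually blocking the merge also relies on marking the Codecov check as required in the repository’s branch protection rules.

```yaml
# Illustrative codecov.yml; the targets and thresholds are placeholder values.
coverage:
  status:
    project:
      default:
        target: auto    # compare total coverage against the PR's base commit
        threshold: 0%   # any drop in overall coverage fails the check
    patch:
      default:
        target: 80%     # lines changed in the PR should be at least 80% covered
```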

Question: How was the onboarding experience?
Answer: Single sign-on is super useful. Our company had an initiative last year to prioritize development tools that we could enable single sign-on with, and here it wasn’t a bank-breaking feature, though sometimes it is. So we really wanted to have that. And then there’s the visibility: the automatic reply on pull requests in GitHub is super useful. When someone clicks on that, they flow through the single sign-on process and land in a view where they can start to get more details and tease out exactly what is going on in the box chart, what it means for them and their particular PR, and what their code coverage looks like. I like that Codecov automates and gamifies things. The engineer can just flow right into it, figure out what’s going on, and get rid of their little red X. Codecov makes compliance totally self-service, fun, and kind of great.

Question: As a company, you are focused on learning and teaching. How have teams taught each other about testing practices and code coverage? How do you go about educating and building alignment across teams when they’re starting with a tool like Codecov?
Answer: So that’s always an onboarding challenge: all the different tools we want you to use. Codecov in particular has fit in and grown very organically, so we didn’t do a major push to have everyone hit a target percentage of coverage. We’ll do that down the road as an important part of our compliance program. But, like other parts of our compliance program, we felt and experienced that a slow build builds developer confidence and engagement before we say, “That thing you were excited about? Now it’s mandatory.” We try not to start with the mandatory part. We have a lot of cross-functional and cross-team features, so we are very intentional about pairing up developers between two different teams. If one team is using coverage tools and the other team is not, we pair them up, which helps to cross-train the teams.

Question: What types of wins have you seen across teams when they start measuring coverage as a metric or even start blocking PR’s for certain quality metrics like coverage?
Answer: I think the biggest win has been an increase in confidence, more than anything else, especially for more junior developers. That feeling of, “You’ve got your code, it’s been reviewed and approved, and you are ready to ship it. If I click this button and my code goes live, it’s on me if I broke something.” Codecov helps developers get over that and makes people more comfortable as they progress.

Question: Has Codecov illuminated anything about your engineering management platform strategy that you hadn’t thought of before?
Answer: Yes, one of the things we found was that developers were first excited to see the replies and that information, then sometimes confused by it, and then wanting to get more information. We really found that it’s important for more people to get in and see the full user interface in the Codecov environment to get the details. The high-level snippet is enough to get users interested, and Codecov makes it really organic to find that drilled-down detail if you need it.

Question: Have you noticed any correlation or success in changing error rates, uptime, etc. as teams have adopted Codecov?
Answer: I didn’t find any strong correlation. Originally I was looking for rates like “as code coverage goes up, error rates go down,” but what we really started to see is that as code coverage goes up, confidence goes up and compliance challenges start to get checked off. That’s been as valuable as error rate metrics for us.

Question: You’ve mentioned a few times this idea of compliance and compliance challenges. When you say that, are you referring to internal compliance or external compliance attestations such as ISO or SOC?
Answer: Udacity achieved ISO 27001 compliance earlier this year, and we’re super excited about it. It definitely caused us to up our game. There were a lot of areas where we had best practices that were word-of-mouth; we switched from that to a defined process and a defined tool, for example Codecov.

Question: Is there anything else that you’d like us to share about the team’s goals or accomplishments? Anything that a prospective developer thinking about coming to work with you would want to know about your engineering team?
Answer: Something that’s really important to us at Udacity is that while boilerplate tools provide convenience, time-to-market benefits, and consistency between our applications, we also really value helping people learn what’s under the hood of those frameworks and how things work all the way down. I think unit testing tools, coverage tools, and browser automation tools are all excellent, but they’re all pretty high-level. So something we do regularly is a weekly deep dive: we want to see what’s under there. We do tech talks and we’ll drill things all the way down to make sure that we understand what’s happening. That’s something we value; we want our staff to learn to look under the hood, which is also something we really want to teach at Udacity. And I think it’s really consistent with our mission: we teach people not just “we want to get you a job using this framework,” but also second-level courses that help you get underneath it, because we think it will really add value long-term to your skill set to understand the why and the how at deeper levels.
