Launching GitHub Copilot

With guest Alyss Noland and hosts Matthew Revell and Adam DuVander

Alyss Noland shares what it really takes to bring a developer product to market, from navigating internal politics to making decisions that actually resonate with developers.

Episode outline

12:45 – Taking ownership of a high-stakes launch: Alyss shares how she was brought into the GitHub Copilot project, balancing multiple high-priority launches while managing stakeholder expectations in a fast-moving environment.

18:20 – Navigating internal challenges and executive priorities: Alyss discusses the complexities of launching a product that was part of Microsoft's partnership with OpenAI, dealing with cross-company coordination, legal considerations, and executive visibility.

24:10 – Defining developer go-to-market strategy: Alyss explains the key differences in how developers evaluate products, emphasizing trust, transparency, and the importance of hands-on experience.

30:45 – The role of early user feedback in refining Copilot: Insights from early testers helped validate key value propositions, refine messaging, and identify gaps in usability and safety.

37:15 – Making pricing decisions with limited data: Alyss describes how the team approached pricing, balancing cost considerations with adoption goals while keeping the developer experience in mind.

45:30 – Overcoming obstacles and learning to push back: A behind-the-scenes look at how small decisions, like trial duration, impacted user experience and internal teams, and the lessons learned about advocating for better outcomes.

52:40 – Advice for marketers launching developer products: Alyss shares key lessons, including the importance of user conversations, setting aside ego, and navigating internal conflicts as topic-based rather than personal.

Transcript


Matthew: Hello and welcome to Developer Marketing Stories. My name is Matthew Revell.

Adam: And I'm Adam DuVander. Today we are hearing from Alyss Noland about launching GitHub Copilot. This was a project with a lot of attention and this is a story from someone who was there at the very beginning.

Matthew: One of the reasons we're doing this podcast is to share with you our training and coaching programme for developer marketing professionals. It's called Developer Marketing in Practice, and you get a mixture of coaching, learn-at-your-own-pace training, and also live sessions with experts in different areas of developer marketing. If you want to find out more, go to developer marketing.

Adam: One of the things that stood out to me about this is it has something for anyone listening. It has the elements of a startup: you have a brand new product, something that has never been done before, and you have to understand what it is so that you can tell developers about it. And then it also has all the elements of working for a large company. This is GitHub, which itself is a big part of Microsoft, which is even bigger. And so you have all the collaboration and the sign-off and the discussions that need to happen at a large company, all wrapped into a great story about what it takes to build and launch a developer product. Alright, Alyss, welcome, and we're really excited to learn about your experience with Copilot. So take us to that moment where someone told you about this new product and said, we need your help.

Alyss: At the time I was effectively one of three people sitting on the product marketing team, but I was one of the only people actually working on product launches, and by one of the only, I do kind of mean the only. And so I was pulled into the Slack channel along with the executive sponsor and a few of the other important individuals while this was still with the innovation team, GitHub Next. And they said, we're working on getting this thing to an alpha state. We want to get it towards being in the hands of some developers, and we need a marketer on it, specifically a product marketer. And I had already built a lot of trust with the product team and with some of these specific individuals, having worked on Codespaces. So I'm coming up to the Codespaces launch, I'm coming up to the new issues experience launch.

I also had State of the Octoverse reports on my plate. Basically every P zero launch for about a year was on my plate. And now I'm thinking, now I have to add this other thing, and I'm also supporting 90 product managers for the entire GitHub org. So in the moment I was like, come hell or high water, I need to figure out what my priorities are, what I need to reshuffle, how to redistribute some of these things, or what's going to drop, because I can't leave this to chance. I can't let this go without my contributions, not because I think so highly of myself, but because it's so rare sometimes in dev tools to have that trust from the product management team. You don't want to forsake that trust and leave them hanging. The other thing is that because there had been so few launches of this kind in the recent or historic past, it was going to be a very tricky space to navigate: the unknown legal standpoint, the community standpoint, the question of how you do the science communication and explain how this thing works.

So I was thinking about all those different aspects of the challenges and the nuance, and it was definitely a career builder. So I was in. I needed to figure those things out. And the initial response from my skip level was like, oh, well, the PMM team doesn't have bandwidth. And I was like, okay.

Adam: That was the response to the executive sponsor, who is coming with this P zero that sort of needs a new number to be associated with it because it's not low enough, right? Is that right?

Alyss: Yes. Yeah. And I don't think folks had quite understood at the time that the GitHub Copilot initiative was one of the first pilot products coming out of the relationship between Microsoft and OpenAI, from the however-many-months, however-many-years exclusive deal and the billion-dollar-plus investment. And so Satya Nadella's team wanted to know what was going on with it, because Satya has not only his office of the CEO, he has a comms team and PR, and they wanted to know that there was a plan to get this product to market. And so one of the disconnects between the orgs at the time was that level of impact and importance, and the attention to what exactly this thing was going to be. And that meant that this was sitting with these very scrappy, kind of skunkworks researchers who could flex in a lot of ways.

They could be researchers, they could explore different concepts, but they hadn't worked with marketing like demand generation, they hadn't worked with the content marketing team and what the editing process looks like, they hadn't worked with programme management. And so I fortunately had already started to build some of these relationships with a few of these folks, and definitely built closer relationships in the meantime, but no product had ever crossed that barrier. At least as long as I had known, or as far as anyone I had spoken to knew, nothing had ever gone from GitHub Next into GitHub's product and engineering org. And it was still quite clunky and awkward, and there were definitely some things that could have been done better, but by all accounts, the teams that were helping support that tried to give a lot of grace and tried to offer support to the extent that they could. And you get a little bit of that when you have more of the startup culture.

Matthew: I dunno if this is relevant, but does that mean that GitHub Actions came after Copilot, or did it just not come out of GitHub Next?

Alyss: GitHub Actions came before Copilot. It preceded me, so I don't have all the details on how that might have been explored. I was the most tenured person on the product marketing team at the time that I was doing all of this, and I think I had been in product marketing at GitHub for a year.

Matthew: So going back to that point where it's your responsibility now, and it's the first time something like that has happened within working memory, let's say, did you have a playbook or something you turned to? What was your day zero, day one set of tasks to orient yourself?

Alyss: Yeah, there's not necessarily a lot in the playbook. I mean, I guess there is: my approach on almost any product is that you have to start by understanding who it's for and how people are going to use it, not just the theory, though you definitely want to understand the theory. And so a lot of it for me initially, as I was joining this working group, was going and sitting in meetings and listening to the way that the lead at the time pitched the product. His name's Oege de Moor, and he's since gone and founded his own new startup. He had been brought into GitHub through an acquisition, and I would just listen to the way that he pitched the product to different teams, and he was very crisp on these things. He also very much understood the value of marketing. And that was part of how I firmed up my idea of where developers were going to see it, because they'd been playing with this thing for a while.

It started out in a browser; by then they had been doing it in VS Code and moving it towards the IDE. I thought, that makes total sense. Why wouldn't it be in an IDE? I took some of these decisions for granted. There are other things in there too, because that's very much focused on the product marketing side of things. Like grokking, again, the user empathy. Where is it in their experience? Is it frustrating?

Is it making things easier? Understanding the value. This also came around the time that we had a new paper out of Microsoft Research, with Nicole Forsgren as the lead author, called SPACE.

It's called The SPACE of Developer Productivity, and it's a framework that builds on the four key DORA metrics. It's a way to look at individual, team, and organisational productivity, but it's not prescriptive. It's not saying, oh, if you're going to be looking at the stability or the throughput of your team, then you need to measure these exact things. It's saying you need to look at satisfaction and wellbeing; at performance; at activity metrics; at communication and collaboration; and at efficiency and flow. So again, how do those things fit in? Those were helpful and very grounding, and things that I continue to go back to.

But then there are the other details. I call myself the go-to-market lead on Copilot because I didn't just do product marketing on that product. It's not cheap to run models, and it is not cheap to continue to build all of the features associated with what is effectively an agent-style experience.

Except it is not providing autonomy to the model to make decisions. It is still a copilot in that way, and that is why we decided to keep that name. And also, I really want to give credit to that team: they came up with that name. I did not impart that name upon them. Oege, actually, I think was the one that originally came up with it. We persisted with it. We didn't have an agency come up with that name, and now all of Microsoft is using it.

So congratulations to Oege. Hats off to him. It was a very astute observation that that was very much the experience of it. And then there's branding. Do we want this to have unique logo marks? Do we want this to have a unique look and feel compared to what's happening with GitHub?

What are the branding colours going to be? How are you going to think about some of the storytelling, especially because, again, this isn't just GitHub services; this is also OpenAI, this is also hitting Microsoft servers? How are you going to handle opting into the privacy policy? And then going back to the open source community, now I'm even talking about what's the legal impact of these things, and what's the community reception going to be? How are we anticipating the reception? How are we laying the groundwork?

And then also, how are you seeding the community? How are you warming people up to this? How are you building the groundswell and the communication around it? How are you testing it out and validating that this is a good experience and that you've smoothed all the rough edges? It's also anticipating when your executive is going to scoop you.

I won't name names, but you can imagine that in the heyday of Twitter there were certain executives who were very active there, and when they felt it was the right moment, they would post things, and we didn't always get to do our little embargo dance with our press friends. So sometimes that meant we just had to react. I've touched on lots of little things, but the whole package, if we take a step back, is really looking at some of the market trends. Where do we think this is going to go? How do we think this is going to impact the global community and the direction of things? How do we think this is going to impact people's ways of working on a micro level, and thus on a macro level, on a five-to-ten-year time horizon? What are the economics of it? Where do we think a payback is going to come from, if at all? And do we have a responsibility towards a payback, or are our KPIs and key metrics for this project different?

And I think you can see in a lot of ways that the key metrics in some of the reporting have been different.

The core was really about loss leading and driving developer adoption, and that's part of why it's still free for a lot of open source maintainers and core contributors to those large projects, and I think they might still be doing discounts of some kind. Then it's also, how do you navigate the education landscape, the professors and comp-sci teachers and high schools? They didn't know this stuff was coming out. I literally sent an email to my old comp-sci teacher from high school and was like, hey, we just launched a new product, people are probably going to use it to do the homework you assign, you should just be aware. And he's like, thanks for letting me know. And you can imagine there are a lot of downstream effects of that, but I'll pause there. I could keep going on, but

Adam: There is a lot in there, and it's clear that internally there was deliberation and intentionality. What's the timeline here, from you hearing about it in that Slack channel, to maybe the first public announcement of any sort, to developers being able to get their hands on it? What is that timeline?

Alyss: The timeline from when I was brought in to help out to when it was available for people to sign up for the tech preview was about three months, which was about standard fare for GitHub at the time: oh, product marketing only needs three months to do all of the go-to-market.

Adam: And maybe so for a product that is similar to something that someone's used to using, but this is a whole shift. That's a lot that happened in that time.

Alyss: It is. There are some things that were made a little easier because it was a tech preview, and that gave us some flexibility. From the tech preview to GA, we had a year. June 2021 is when the tech preview and the waitlist became available, and then it was about June 2022 when the GA was announced and made available, and then there was a two-month free trial off of that, so no one paid until about August 2022. There are of course details in there, but that year was a year of iteration. It was a year of letting over a million developers in to try the product out in a staged approach. Can we deal with capacity? What metrics should we be paying attention to in terms of the way the product's being adopted? Can we start to refine some of those metrics? Are they the right ones to pay attention to?

So I would say it still felt like a crunch on both counts. It still introduced certain bottlenecks and limitations. One of the ways that came out was when we were thinking about routes to market and how this was going to be adopted. The first thing is that GitHub is a predominantly US company, even though it is international. I feel like this is the quiet part you're not supposed to say out loud. It is used internationally, and I don't want to discredit it for that, tonnes of developers in India and all over the world, but if you're going to price something in USD and not localise currency and not localise pricing, then you've priced it for a US market, plain and simple. So then we're thinking about a couple of things there, right?

You're thinking about what is a US developer largely willing to pay in order to maintain access to this?

And also, what's a pricing model that's palatable to a consumer? Because on the enterprise front, what we saw at GitHub, and you can see this with Actions and a few other things, especially over in the Azure part of the world, is that people want to do committed spend and burn it down; they want to pay for what they use. They don't like seat-based pricing, because then you have to do seat reconciliation and think about whether this person actually needs access, and that itself introduces challenges. Adam, I think you've done some breakdowns of prices and pricing models for different companies; you can get into all of this as its own philosophy. And so again, backing up: if you want to get this into the hands of developers, it is going to be basically a brave new world, it's very innovative, it's going to present risk, and it's going to require people very early on the technology adoption curve who are willing to navigate that.

Then how do you get it into their hands? At first it's a free trial; give them something that they don't have to enter a credit card for. And so the technical preview was a great way to do that. Then it was a 60-day trial. I don't really think they actually needed 60 days; it's not 60 days any longer, it's now I think 30 days or maybe less. And then we were effectively choosing somewhere between $10 and $20, which is where we had run some models.

We also surveyed users with a couple of different pricing methodologies: what do they feel good about paying, would they be sad if they were going to lose access to this, and if they did have to pay, what would feel right? And $10 was kind of validated, but the survey that was run at the time wasn't necessarily intended to be a pricing survey.

So I would've preferred us to do something with a little bit more rigour, but you had enough converging gut instinct and validation for the markets that we wanted to enter, and that's sort of where we landed. And that also comes with a lot of asterisks, because again, $10 a month for the GPUs and for everything else that's going into it is not necessarily enough to pay the bills, and it's certainly not going to be enough for continued development and continued features. So then what are you going to do there? My initial thinking was, do we want to say, hey, for the first two years we're going to give people 50% off? Do we want to say that for some unknown period we're going to continue to give people a discount or a coupon code for being early adopters, and then at a certain point, after a million people or whatever have joined, we'll stop doing that? And then once we introduce some pricing tiers, we'll create some more differentiation. Which, as you'll have now seen, and something we had already been talking about at the time we were going through these conversations, is that the business and enterprise tiers were going to come with different features. You try to separate those as much as you can for the needs of those teams and teams of teams, where individual developers should hopefully not need those things. You try not to put things into those tiers that developers would need, because that is not a really good way to structure products, for various reasons, and then you price those tiers higher.

Matthew: I wanted to take a step back. You mentioned being go-to-market lead for this, and you've talked about go-to-market several times. What is a developer go-to-market?

Alyss: I think there are several things, not the least of which is that developers have what I like to lovingly refer to as a relatively strong bullshit detector, and depending on who your audiences are, maybe it's security professionals, maybe it's researchers in the academic sense, they might have even stronger bullshit detectors. They're looking for a certain level of evidence-based information and science communication and fidelity. You can't be fluffy. You can't say this is going to prevent you from introducing an OWASP Top 10 vulnerability, because it's not going to. You still have to review your code before you commit it. You still have to go through PR review. People still have to think about the change set they're going to introduce in a pull request. And unfortunately some people aren't doing that, and that's just us perpetuating the same sins of years past, prior to Copilot existing. So developer go-to-market, I think the foundation of it starts with that.

You can't shy away from learning about the tools, learning about what the audience is using, learning about their day-to-day and what pains them, simply because those are not a part of your day-to-day as a marketer and because they might be perceived as scary or foreign. People are willing to go to a Michelin restaurant and try something they've never tried before; you should be willing to learn about or try a tool you've never tried before just as quickly. And separately from that, it's not that dissimilar in that it's a lot of community and a lot of word of mouth. So there are things that I can say as a brand that are a little bit more difficult to say as an individual, which are sometimes the claims, like here's what we've seen people do with Copilot. But then there are things that individuals, myself as Alyss Noland, or Martin Woodward, who's currently the VP of developer relations, can say that the brand cannot, and you have to think about that as a tool in and of itself.

And that's one of the great things about a programme like GitHub Stars: bringing those community users into the fold, having that trust, and inviting that, whether you want to call it criticism or feedback. I think Mary Thengvall has this framing of representing the community to the company and the company to the community. And the more that can be a very intimate shared Slack channel, the better; bring that as close to where people are working as you can. Not only because your community has day jobs and you need to give them easy access to the way they're providing this information to you, and you shouldn't expect them to show up wherever you are, but also because the people that you're working with, your peers, have that one space that they're working in as well, and you want those feedback loops to be really tight.

I'm trying to think of some of the other little details in there. Even going through the popups that you see as your trial expires in GitHub Copilot, what goes into the email, and again the claims, what information did we need to have in the FAQ? All of this goes back to the go-to-market lens and how developers are unique. They do want to know how it works. They do want to know what makes it tick. This isn't trust-your-aunt-because-your-aunt-made-this-recommendation. This isn't an MLM. It is very much a little bit of trust and verify.

Some people will go and kick the tyres, but that's still the verify part. And so the developer buying process is: there's that initial exploration, that deeper exploration, and then the willingness to try. And you do have to build them up towards that willingness to explore the thing and go through the process.

And you have to make that process not suck. So have you tried to download it? Have you tried to go through that setup process as a product marketer or as a marketer? Because if you haven't tried it, and you haven't given that feedback to the product teams, then it might be a bad experience for newer people. And you have to lower that barrier to entry: the more you can make it delightful for someone who is newer or at a lower skill level, the more it can also be delightful for someone at a higher skill level. So there's this big bell curve and learning curve, and you also have to think very conservatively about who you are designing for, because from the get-go this really was not going to be for code school students.

It certainly didn't start out that way. Now it can be in a slightly better state for people coming out of code school, but you're still going to end up with goop if you don't understand what it's asking you to do. It's like sending something through Google Translate and handing it in to your Spanish teacher. And so that was another part of it: what story are we trying to tell, what's going to be healthy for the long-term impact we want to have on the market, and what's the current state of the product? So again, I could continue to extrapolate on all of that, but I feel like those are some of the high notes.

Matthew: So one of the things I've always thought is that you can alienate either audience by catering too much one way or the other, and one of the great ways of differentiating developer experience and so on is by knowing your audience. So I just wanted to understand a bit more what you meant about how you can basically serve both audiences.

Alyss: When I'm talking about making things easy and delightful and serving a newer developer, the particular stage I was really thinking of is that getting-started process. It's things like how many steps you have to go through to set up a CLI or to configure it. How many times do I have to open my Vim editor in order to mess with this? What other packages do I have to install, and what do I then have to manage? Am I on Python 3? Am I on Python 2? The more you have to go through those various little institutional-knowledge steps, the more you start to gravitate towards intermediate and higher levels of expertise. If I'm asking someone to do an interactive rebase, I might as well say this isn't for beginners.

If you have not been using Git for at least three years with a group of people, because it would necessitate that you deal with some degree of merge conflict, then you shouldn't touch this at all. But when I'm thinking about that getting-started process, I don't think anybody likes to sit there setting something up. That's not what people are spending their time trying to get to; that's not the aha moment. So the more time people spend trying to get to the aha moment, the worse. And that's really what I'm referring to: the more the initial developer experience thinks about the novice, the acolyte who is touching the product for the first time, rather than saying, oh, we're here for the old crusty IRC dudes, so they'll figure it out, the less you're creating your own leaky funnel and leaky bucket. But I don't mean that to be a flattening of what audience you intend to serve, because that can be very alienating, and that can lead you to build a product for too many people.

I do still think you should be very deliberate; deliberate is the perfect word for it. If I'm building for the automotive industry or the games industry, I should know that the games industry is predominantly going to be using Perforce or some other centralised version control, because they don't want to check out the entire repo. They're going to have large files that they need to store, and LFS isn't going to work for them. So what they're going to need to do is very specific checkouts of the exact branch or the exact part of the repository they need to work in, and Git can remain the domain of people working in SaaS software and microservices. Fine. That's okay.

Adam: Kind of on that, the understanding-the-audience piece, you mentioned the scepticism and trying to guess at what that scepticism is, but you also mentioned there's this early group that you had access to, to be able to verify that. Can you talk about something that you learned from that early group that you were able to use to have a better launch or a better experience for the people that came after?

Alyss: Yeah, there were several things in different regards. The first, I would say, was validating the initial hypotheses of the value pillars. We had these guesses; Oege had his idea about what the best use cases were. So, hey, you have to write boilerplate, repetitive code. And really the way that framed itself up in the messaging house, the messaging framework that I had written, was around satisfaction: increase developer satisfaction by reducing the boring shit, the boring toil of the repetitive aspects of coding. And that, I think, got relatively well validated. We had some good feedback from one of the early founders of Instagram, I'm trying to remember his name. He's now a CTO, I think, somewhere. I am blanking on all these particular details.

There were some other aspects as well, like how much is this going to help someone pick up a programming language that they're less familiar with?

That ended up being a little bit more of an exploration, I would say kind of a secondary pillar, and a tertiary one was almost towards innovation: am I a product manager who can't code as well, and can I pick this up and have it help me? That really depended on how well someone already spoke computer, if you will, because you can't just go write a comment that says 'make an e-commerce cart'; that's not going to give it enough. It might be able to find a package that's already been made, but it's just as likely to make up a package name that's never existed before as it is to actually give you all the code you need. You have to be a lot more specific than that. You have to say, hey, I'm going to instantiate a cart function, I'm going to instantiate a checkout function,

I'm going to have to bring in Stripe to do all these things. And so that innovation space was a little bit more fuzzy, and not something that we brought up as much. There was a little bit of validation in it, but it also happened so infrequently that it felt somewhat like the antithesis. There were some researchers who, because they had read enough about some of these deep learning topics, could write the comments that they needed to write, but they didn't know the code well enough. And so they were able to use it to help them get to the point they needed to in Python or whatever, and that made some of their work go faster and helped them innovate and move forward. Unfortunately it's not like I have an example that I can really show for that one.
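To make that distinction concrete, here is a rough sketch, not from the episode, of the difference between the vague "make an e-commerce cart" comment and the specific, function-level comments Alyss describes. The class, function names, and the Stripe reference are purely illustrative, and the code is hand-written rather than actual Copilot output.

# Too vague: a comment like "make an e-commerce cart" gives the model almost
# nothing to work with. The function-level comments below are the kind of
# specific prompts described above; the code is a hand-written illustration
# of the target, not real Copilot output.

from dataclasses import dataclass, field


@dataclass
class Cart:
    """A minimal shopping cart keyed by item name, with unit prices in cents."""
    items: dict[str, int] = field(default_factory=dict)
    prices: dict[str, int] = field(default_factory=dict)

    # Instantiate a cart function: add `quantity` of `name` at `unit_price_cents`.
    def add_item(self, name: str, unit_price_cents: int, quantity: int = 1) -> None:
        self.items[name] = self.items.get(name, 0) + quantity
        self.prices[name] = unit_price_cents

    # Instantiate a checkout function: total the cart in cents. A real
    # integration would hand this total to a payment API such as Stripe.
    def checkout_total_cents(self) -> int:
        return sum(self.prices[name] * qty for name, qty in self.items.items())


if __name__ == "__main__":
    cart = Cart()
    cart.add_item("notebook", 599, quantity=2)
    cart.add_item("pen", 150)
    print(cart.checkout_total_cents())  # 1348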

And we had a few other people who spoke to those things. But again, that happened so few and far between; those are more unicorns than almost anything else. Some of the other good examples were people who were looking at the safety behaviour of Copilot. So, can I get it to use curse words? Can I get it to use some of the other '-ism'-type language that we tend to see? What are some of the other escapes, what we'd now call prompt injection or prompt escapes? And they were very good at letting us know if they had succeeded in doing that and whether there were things to be concerned about, or people would just post about it on YouTube or Twitter, which is not necessarily something you want to see a lot of.

And sometimes those things take on a life of their own, but at the same time that is someone red teaming, that is product feedback, and so they were useful in that. There was some validation of the safeties that were present, and of whether there were any other steps we needed to take on the client side to handle some of the prompts. And similarly, just the utilisation. So one of the things that we were looking at early on was suggestion acceptance. If we did see a reduction in suggestion acceptance, then eventually, after enough suggestions were ignored, we would go into a silent mode. We realised, hey, we're actually interrupting your flow.

And you might also think about that as like, oh, well maybe it's also saving you compute.

Maybe it is, I don't really know; it wasn't really a factor. It was very much in that efficiency-and-flow thinking: how do we make this a good experience for developers? And then we were also looking at it from a language standpoint. Are people accepting more suggestions in Python and JavaScript? Are there more users in any of these tools? And we were seeing some fuzziness sometimes in certain DSLs or certain frameworks, and sometimes that was just based off of the training data size.

So some of that is a little bit more product-oriented, but it again goes back to messaging: how are you thinking about the way this stuff gets communicated back out, and what are you putting in safety and trust centres?
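Alyss mentions suggestion acceptance as an early metric, and a silent mode that kicks in once enough suggestions are ignored. As a purely illustrative sketch, with invented thresholds, class, and method names rather than GitHub's actual implementation, the shape of that signal looks something like this:

from collections import deque


class SuggestionTracker:
    # Toy model of the signal described above: track acceptance of inline
    # suggestions and back off ("silent mode") after a run of ignored ones.
    # The window size and threshold are invented for illustration.

    def __init__(self, window: int = 50, mute_after_ignored: int = 5):
        self.window = deque(maxlen=window)   # recent accept/ignore outcomes
        self.mute_after_ignored = mute_after_ignored
        self.consecutive_ignored = 0

    def record(self, accepted: bool) -> None:
        self.window.append(accepted)
        self.consecutive_ignored = 0 if accepted else self.consecutive_ignored + 1

    @property
    def acceptance_rate(self) -> float:
        return sum(self.window) / len(self.window) if self.window else 0.0

    @property
    def silent_mode(self) -> bool:
        # Stop interrupting the developer's flow after too many ignored suggestions.
        return self.consecutive_ignored >= self.mute_after_ignored


tracker = SuggestionTracker()
for outcome in [True, False, False, True, False, False, False, False, False]:
    tracker.record(outcome)
print(round(tracker.acceptance_rate, 2), tracker.silent_mode)  # 0.22 True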

Matthew: You've already touched on how this was a lot to get through, and there were moments of crunch and so on, but were there any particular low points where you just felt it wasn't going to be possible to get it done?

Alyss: I want to say, with a certain level of confidence, that I didn't have any wow-this-isn't-going-to-happen, I-dunno-how-we're-going-to-do-this moments. I think instead my challenges were, again, that I had so much on my plate; how am I going to manage all of this? There I was doing Copilot, co-authoring the 2021 State of the Octoverse with Dr. Nicole Forsgren and Eirini Kalliamvakou at the same time, while also managing the website creation for that. We were doing GitHub Universe, and I was leading blog post authoring and hosting for some of those things, launching Codespaces, hiring new people to the team.

It really didn't stop. And then, again, there are still other product managers launching other products. I would do these consulting sessions, like, hey, what is it that you want to achieve?

What are the challenges you're running into? And every Monday we would have these product release meetings, and I would sit in on those, find out what was coming in the upcoming weeks, and just try to help out the teams that were bringing other things to market, even if I wasn't able to support those myself. And then, between all of these things, I'm still also interfacing with legal. We have legal counsel embedded with our product teams at GitHub, and that can be anywhere from, hey, if we are adding this new product to the product portfolio, we need to think about what our terms of service are and that's going to introduce some new data processing agreements, to, hey, if we want to serve an EU market, then yeah, we're going to have to have some data residency.

There are lots of asterisks in there. So that was one of the relationships I was managing and working with quite closely: what's the current opinion, and where does that leave us in terms of features we need to implement and billing requirements we have? And actually, I think one of the funniest fights I got into in the whole process was about the 60-day trial. As we were coming up to the window where we were going to launch Copilot to GA, we had this discussion going in various documents and forums, and the director of product for Copilot had decided on the 60-day window, like, fine, sure. And I realised that that was going to land the end of the trial on a Saturday, and I was like, hey, I don't want to do that to our support team.

First, they're not going to be as well staffed. Second, people are going to get in on a Monday, if that's even when they're getting in and starting to use this, if they do weekend projects. There are just so many things that could go wrong there. It would be so much better for people to see that their last day of the trial is a Monday and go enter their credit card. And we could also see from past GitHub usage patterns that people normally use it during weekdays. And I got a lot of initial pushback; I couldn't get that person to budge on it. So I brought that conversation into an executive discussion, an e-staff meeting, and we were about to wrap the call and they're like, is there anything else?

And I'm like, oh, actually the free trial is going to end on a Saturday. And they're like, oh, we'll just make it a Monday. And I'm like, great idea.

So that was not so much a pit-of-despair thing, but I was frustrated, because I knew it was going to negatively impact the billing engineering team, who had already done some of the work to get us to that point with the 60 days and were going to have to redo some of it. And I wish I had pushed harder. I think that's one of the more frequent lessons I learned through this process: especially as a woman who grew up in the US, I was socialised in this very feminine, non-hierarchical way of trying to share the load, trying not to be the queen bee. And that doesn't necessarily net out the best in a very dev-tools-like product and engineering space. More often than not it's about being loud, being a little bit more opinionated, fighting for what you want and what's right, and not doing it in an obnoxious way.

You still need people to like you; relationships are still very critical. But doing some of those things early and often, thinking about the second-order and third-order effects of the decisions you make and the teams they're going to affect, making sure that you fight for those teams even when they're not in the room, and getting ahead of those decisions. If I know it's going to affect the billing engineering team: hey, this is coming, I'm sorry that it's coming, is there anything I can do to make it easier on you? Trying to give those teams a heads up, those are all things that I've tried to implement elsewhere whenever possible.

Adam: Yeah, and it sounds like saying it more than once was important in this particular instance. So, someone listening is about to go into maybe not a GitHub Copilot, but still an important launch for their company and their product. Thinking back to the time when you were put into that spot, what advice would you give to yourself, or to those listening, as they enter that period in their own professional lives?

Alyss: As marketers, product marketers, product managers, almost any role actually, I can't emphasise enough the importance of hearing from real users, or the people you think are going to be your users. Obviously we don't always know who our ideal customer profile is going to be if we haven't started selling a product, depending on the stage things are in, but there's no replacement for it. And the gap that I see grow between teams that do not have those conversations and teams that are having them is incredibly vast. The other advice is to set aside your relationship to your ego and realise that conflict about the work you're doing is topic conflict.

You're working on software; it is not life or death. It's not a hospital. It's computers, somebody else's computers, ones and zeros. It's not necessarily that important. Your identity isn't actually anchored in it, or it shouldn't be. Your self-worth isn't based off of whether or not people think you're right.

Ask the question; find out what it is you need to know. The more questions you can ask, the better you can understand what it is you're trying to sell, what the goals are, what the goals of the teams you're working with are, the future, the things going on in the market, what your competitors are doing. And there's obviously a measure to these things; you can't take in too many inputs, and there has to be a point at which you stop. But that willingness to ask questions, and to set aside that fear of being embarrassed, matters.

And again, it's topic conflict, not people conflict. People aren't looking to fight with you; it's a fight about the subject matter. So that advice, I think, is stuff that I'm still trying to reinforce and keep in mind, because our ability to separate topic conflict from people conflict, and how that intertwines with our ability to progress in our careers, is quite difficult. It's not always something our organisational environment allows us to do, but you're an important part of the organisational fabric, the organisational psychology, and the psychological safety of the way these launches happen. And that itself is important to the right wins and the right outcomes in the way launches happen.

Matthew: Amazing. Thank you. Alyss, where can people find you?

Alyss: I am precisely Alyss on Bluesky; I have not been particularly active. I'm Alyss Noland on LinkedIn, and I'm a little bit more active there. Otherwise, probably don't look for me.

Adam: And it's A-L-Y-S-S, right?

Alyss: Yes. Thank you, Adam. Yes. My first name is A-L-Y-S-S because I decided to make it hard for everyone. And then Noland, like no land, the zero property, as a funny little Alyss in Wonderland joke for everyone.

Adam: Excellent. Well thanks for joining us. Yeah, thank you.

Alyss: Thank you.