GitHub Universe 2024 opening keynote: delivering phase two of AI code generation

31 Oct 2024

GitHub Universe timeline of key product launches (0s)

GitHub CEO Thomas Dohmke intro on phase two of AI and software development (54s)

  • GitHub Universe is being held for the 10th time, and for the first time at Fort Mason, marking a decade of open-source creation and consumption; open source has "won" and can no longer be denied as a major force in the industry (1m9s).
  • Since 2015, developers have solved big problems by closing over 67 million GitHub issues, merging over 180 million pull requests, and sparking over 320 million public projects on GitHub, making open source the greatest team sport on Earth (2m5s).
  • The age of AI has not just begun, but has crossed the chasm, with a 98% year-over-year growth in AI creation in 2024, and Python becoming the number one programming language on GitHub, surpassing JavaScript (3m11s).
  • The primary use case for GitHub is expanding, with open source creation becoming a big trend for data science, academia, and research, and the programming language of AI is on the rise (3m49s).
  • GitHub has evolved into the world's largest creator network for the age of AI, with developers using AI to build AI, and laying the blueprint for what it means to partner with intelligent machines (4m9s).
  • Copilot, launched three years ago, is the most adopted AI developer tool on the planet and is now entering its next phase; the first phase of AI code generation was built on three key pillars: AI-infused coding, conversational coding, and multi-model functionality (4m43s).
  • Phase two of AI code generation has arrived: the partnership between developers and AI goes to the next level, with AI becoming native to the entire developer experience through AI-native workflows and agentic coding, where AI assistants execute tasks at the developer's direction in natural language (6m11s).

Multi-model choice for developers (7m0s)

  • GitHub is moving from multi-model functionality to multi-model choice in phase two, allowing developers to choose the models that work best for them (7m1s).
  • There has been a boom of high-quality models in 2024, and GitHub is responding by providing developers with the agency to build with the models they prefer (7m8s).
  • GitHub is an open developer platform, and developers expect to have the freedom to choose the models that suit their needs (7m15s).
  • GitHub is delivering multi-model choice by providing access to OpenAI's o1-preview and o1-mini for all Copilot users (7m23s).
  • o1-preview has been found to have strong reasoning capabilities, allowing for a deeper understanding of code constraints and edge cases, and producing efficient, high-quality results (7m51s).
  • Developers can now use o1 in Copilot Chat by simply clicking on it and getting started (8m2s).

Anthropic's Claude 3.5 Sonnet + GitHub Copilot (8m9s)

  • One model in particular has gained significant attention from developers this year: Claude 3.5 Sonnet, developed by Anthropic. (8m9s)
  • A new partnership between GitHub and Anthropic is being announced, which brings the power of Claude to GitHub Copilot. (8m21s)
  • As a result of this partnership, users can select Claude in the model picker, starting today. (8m35s)
  • The Claude integration with GitHub Copilot is rolling out to users' machines now. (8m39s)

GitHub CEO Thomas Dohmke + Anthropic Co-Founder Jared Kaplan (8m45s)

  • Jared Kaplan, co-founder and chief scientist of Anthropic, was welcomed on stage, and his background as a theoretical physicist was discussed (8m46s).
  • Kaplan explained that he was always fascinated by science and AI, but initially pursued physics due to the lack of tools for software development and AI research at the time (9m10s).
  • He eventually transitioned to AI after being inspired by friends working in the field, including Dario Amodei, who was leading research at OpenAI (9m41s).
  • Kaplan discussed the inception of Anthropic and the development of Claude 3.5 Sonnet, which was designed to make AI models more accessible to developers (10m4s).
  • The goal of Anthropic is to steer AI in a beneficial direction and make frontier model capabilities available to as many people as possible, as quickly as possible (10m12s).
  • Claude 3.5 Sonnet was optimized for software engineering and is intended to be a great model for developers to work with (10m57s).
  • Kaplan believes that the partnership between GitHub and Anthropic will be amazing for developers, providing them with access to the latest tools and models (11m5s).
  • He emphasized the importance of getting hands-on experience with AI tools, rather than just reading academic papers, and encouraged physicists to learn tools like GitHub (11m26s).
  • Kaplan discussed the roadmap for Claude, which includes improving the complexity of tasks that AI models can perform and expanding the environments in which they can work (11m49s).
  • Future releases are expected to enable models to perform more complex workflows, work alongside humans, and collaborate in various environments, including software testing and app development (12m10s).
  • Kaplan expressed excitement about Claude being able to work alongside humans in many different contexts (12m46s).

Google's Gemini 1.5 Pro + GitHub Copilot (13m11s)

  • A partnership has been formed with Google to bring Gemini 1.5 Pro to GitHub Copilot, allowing developers to build with this new model in the future (13m21s).
  • Gemini 1.5 Pro will be available in GitHub Copilot's new model picker, allowing developers to choose the model that best suits their needs (13m31s).
  • Gemini 1.5 Pro is natively multimodal, enabling it to process code, images, video, and text simultaneously, making it suitable for code suggestions, writing documentation, and explaining code (13m36s).
  • The multimodal capabilities of Gemini 1.5 Pro will be integrated into many of GitHub Copilot's surface areas in the coming weeks, providing developers with more options and flexibility (13m50s).

GitHub Copilot in Visual Studio Code (14m0s)

  • GitHub Copilot has been enhanced in Visual Studio Code to provide a better experience, with the goal of reducing context switching and increasing productivity (14m0s).
  • Cassidy, the presenter, showcases the new features of GitHub Copilot, highlighting its evolution into an intelligent platform with multiple tools and services (14m24s).
  • GitHub Copilot is demonstrated on a website called Copilot Airways, where the presenter uses it to autocomplete code and explain its functionality in different languages (15m5s).
  • The presenter uses GitHub Copilot to ask for suggestions on how to improve a table on the profile page, and the AI tool suggests using a line graph that already exists in the codebase (16m49s).
  • GitHub Copilot's intent detection feature is showcased, where it figures out what the user is asking and uses the workspace agent and repo indexing to provide relevant suggestions (17m0s).
  • The presenter decides to create a new component, an area graph, and notes that this would require multiple changes to the code, including creating a new component, changing the profile page, adding tests, and updating the README (17m21s).
  • The presenter sets the stage for demonstrating how GitHub Copilot can be used to edit multiple files at once, streamlining the development process (17m35s).

Multi-file editing (17m36s)

  • GitHub Copilot now supports multi-file editing, allowing users to make changes across multiple files at once (17m43s).
  • A new edit mode has been introduced, enabling users to select a model, such as Claude, and provide instructions for the AI to generate code (17m51s).
  • Users can provide specific instructions, such as creating a new reusable area graph from a points activity service, and the AI will generate the necessary code and tests (18m1s).
  • The AI will create step-by-step instructions for the code generation process, including creating a chart component, adding it to a profile page, and creating tests for the new component (18m28s).
  • The generated code is based on the existing codebase and includes green diffs to highlight new changes (18m41s).
  • The AI also generates unit tests for every single component, which can be reviewed and saved (19m0s).
  • The generated code can be accepted and saved, and the changes will be reflected in the relevant pages, such as the profile page (19m15s).
  • Under the hood, the new monthly points area chart is gated behind a feature flag, which sets up the next part of the demo (19m31s).

Copilot Extensions (19m40s)

  • GitHub Copilot Extensions was released in May, allowing users to extend GitHub Copilot's capabilities to perform various tasks, such as integrating with feature-flag services like LaunchDarkly (19m41s).
  • The LaunchDarkly GitHub Copilot extension enables users to create flags based on specific conditions, such as a monthly points chart, and toggle them directly within VS Code (20m2s).
  • GitHub Copilot Custom Instructions is a feature that allows users to add custom instructions to their GitHub repository using a markdown file, enabling them to specify specific actions for Copilot to perform, such as adding certain docstrings or following specific linting rules (20m29s).
  • The custom instructions file can be used to automate various tasks, such as adding comments or ensuring that reusable React components include specific services (20m46s).
  • GitHub Copilot Extensions are available now, with a range of extensions to choose from, including those from Supabase, Atlassian, and Sentry, and users can also build their own extensions (22m5s).
  • The extension marketplace will continue to grow, offering users a wide range of choices, and GitHub Copilot's capabilities will extend beyond VS Code, with support for Apple developers (22m28s).
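Based on the custom-instructions description above, a repository-level instructions file lives at `.github/copilot-instructions.md`; the specific rules below are illustrative examples, not taken from the keynote:

```markdown
<!-- .github/copilot-instructions.md -->
- Add a docstring to every exported function.
- Follow the repository's linting rules; prefer descriptive variable names.
- Reusable React components must read data through the points activity service.
```

Copilot reads this file alongside each chat or edit request, so the rules apply without being restated in every prompt.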

GitHub Copilot + Xcode (22m34s)

  • GitHub Copilot is available in Xcode today in public preview, allowing developers who build for the Apple ecosystem to leverage its power as they develop, test, and deploy (22m35s).
  • Developers can access GitHub Copilot whether they work on mobile, desktop, remotely on GitHub.com, or locally on their machine (22m52s).
  • The integration of GitHub Copilot with Xcode aims to simplify the lives of developers, providing them with the agency to make their own choices, regardless of where they work (22m58s).

GitHub Models (23m7s)

  • GitHub Copilot is being expanded with multi-model choice, allowing developers to build at higher layers of abstraction with models of various sizes to stay up to speed (23m21s).
  • GitHub launched a new tool called GitHub Models, an interactive playground for AI engineers to build applications right where their code is, letting users go from model to playground to deployment quickly (23m47s).
  • GitHub Models launched with the latest models in the industry, including Mistral Large and Mistral NeMo from France's Mistral AI, as well as small language models like Microsoft's Phi-3 (24m7s).
  • The platform also includes open-source models, such as Meta's Llama suite, and has added more models from Cohere (including Command), Bria AI, and OpenAI (24m45s).
  • GitHub Models lets users explore and compare different large language models (LLMs) to find the best one for a specific task, and Copilot can guide users in selecting the most suitable model (26m11s).
  • The @models extension in Copilot Chat enables users to ask for advice on which model to use for a particular task, and Copilot will provide recommendations based on data from the model marketplace (26m34s).
  • A demonstration of GitHub Models and Copilot was shown, where a user wanted to extract JSON data from travel reviews and used Copilot to find the most suitable Mistral model for the task (26m48s).
  • GitHub Models lets users try out different models, such as Mistral Large and Mistral NeMo, and compare their performance on specific tasks (27m0s).
  • The compare feature enables users to run two models in parallel for the same task and easily compare their results (27m30s).
  • Users can add system prompts to explain to the model what task to perform, and the prompt is automatically applied to both models being compared (27m58s).
  • After running the models in parallel, users can compare the results and choose the model that best suits their needs (28m27s).
  • The chosen model can then be used to generate code, such as extracting JSON data from a review, which can be easily integrated into an application (28m46s).
  • GitHub Models provides a code tab where users can pick their language of choice and get a copy-and-paste code sample to interact with the model (29m26s).
  • Users can use their GitHub token for authentication, eliminating the need to sign up for new services or jump through hoops (29m39s).
  • When ready to go to production, users can get an API key for Azure AI, which has higher rate limits and can be scaled (29m50s).
  • GitHub Models also includes multimodal models, which can handle multiple types of input and output (30m6s).
  • Multimodal models can accept images as well as text, allowing for more diverse inputs and interactions (30m12s).
  • A demonstration of a multimodal model, OpenAI's GPT-4o mini, showcased its ability to describe an image of the Golden Gate Bridge in one word: "Bridge" (30m30s).
  • The GitHub Multiverse is being extended, and GitHub Models, including the @models extension, is now available to all GitHub users (31m1s).
  • With GitHub Models, users will have access to the playground, side-by-side comparison, and more features to aid in their development (31m16s).
  • The availability of GitHub Models aims to enable over 100 million developers worldwide to become AI engineers (31m22s).
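The code-tab flow described above can be sketched in Python. The endpoint URL and `GITHUB_TOKEN` authentication follow GitHub Models' launch documentation; the model id, `extract_review`, and `parse_review_json` are illustrative names for the travel-review JSON demo, not verbatim from the keynote:

```python
import json
import os

def parse_review_json(reply: str) -> dict:
    """Parse a model reply as JSON, tolerating an optional Markdown code fence."""
    text = reply.strip()
    if text.startswith("```"):
        # Drop the opening fence (with optional language tag) and the closing fence.
        lines = text.splitlines()
        text = "\n".join(lines[1:-1])
    return json.loads(text)

def extract_review(review: str) -> dict:
    """Ask a Mistral model on the GitHub Models endpoint to structure a review."""
    from openai import OpenAI  # pip install openai

    client = OpenAI(
        base_url="https://models.inference.ai.azure.com",  # GitHub Models endpoint
        api_key=os.environ["GITHUB_TOKEN"],  # a GitHub token, no separate signup
    )
    resp = client.chat.completions.create(
        model="Mistral-large",  # whichever model won the playground comparison
        messages=[
            {"role": "system",
             "content": "Extract JSON with keys 'rating' (1-5) and 'summary' "
                        "from the travel review. Reply with JSON only."},
            {"role": "user", "content": review},
        ],
    )
    return parse_review_json(resp.choices[0].message.content)

# Example (requires a GITHUB_TOKEN with access to GitHub Models):
#   extract_review("Loved the flight: great crew, 5/5 would fly again.")
```

Swapping the same token for an Azure AI key, as the keynote notes, is a base-URL and key change rather than a code rewrite.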

Copilot Workspace (31m34s)

  • Nille, a high school senior, is the lead programmer for the robotics team Orange Dynamite, which has been working on a project for 7 months (31m34s).
  • Nille is interested in aerospace engineering and has found GitHub Copilot Workspace to be a valuable tool, surpassing other AI offerings in its ability to understand programming tasks and real-world solutions (31m50s).
  • GitHub Copilot Workspace represents the future of GitHub Copilot, allowing users to assign tasks and guide the AI as it implements solutions, making it possible to understand complex concepts in a matter of minutes (32m6s).
  • With GitHub Copilot Workspace, tasks that conventionally took a long time can now be completed quickly, and the tool is deeply integrated into Nille's workflow, making it hard to imagine working without it (32m21s).
  • Nille considers GitHub Copilot Workspace a game-changer that continues to surprise and improve their work (32m33s).

Copilot Workspace demo + GitHub Next (32m44s)

  • GitHub Next is the in-house research team building for the future of software development, and it launched a technical preview of Copilot Workspace six months ago, on April 29th (33m3s).
  • Copilot Workspace is an AI-native development environment where users build with agents and the power of AI at every step from idea to execution, starting with an issue and with three agents helping: a spec agent, a plan agent, and an implementation agent (33m25s).
  • Since the launch, developers on the technical preview have created over 15,000 pull requests using Workspace, and over 10,000 of those pull requests have merged (34m6s).
  • Thousands of developers have engaged with Workspace, providing feedback that has been generous and helpful, whether positive or constructive (34m24s).
  • Much of the feedback has been positive, with some users sharing how Copilot Workspace helped them bootstrap an app easily and shrink timelines from days to hours or minutes (34m46s).
  • Other feedback has been constructive, pointing out issues such as the difficulty of validating code inside Workspace, which the team is addressing (34m57s).
  • The team has made over 100 changes to Copilot Workspace since the start of the technical preview, including redesigns, new features, paper-cut fixes, and quality-of-life improvements (35m20s).
  • Two new agents have been added to the workflow: Brainstorm for ideation and another for building and repairing code, bringing the total number of agents to five (35m51s).
  • A VS Code extension has been released, and Workspace is being expanded to pull requests, with AI acting as a second brain and a truly helpful partner alongside developers (36m9s).
  • The next iteration of Copilot Workspace is available, and it continues to build on the existing features and functionality (36m23s).
  • Copilot Airways is used as an example to demonstrate the capabilities of Copilot Workspace, adding a feature that lets an image be attached to each post (36m47s).
  • Copilot Workspace lets users work on issues and implement features without having to create a new branch and think through the implementation from scratch (37m25s).
  • It provides a specification of how to implement the issue, and it can be used iteratively based on the context provided by the brainstorm feature (37m29s).
  • The brainstorm feature shares suggested questions and ideas to get started with implementing an issue, and users can also ask their own questions to generate more ideas (37m52s).
  • Copilot Workspace can derive context from multiple sources, including GitHub issues and brainstorming sessions, to generate a plan for implementing a feature (38m34s).
  • The plan generated by Copilot Workspace includes a list of changes to make in which files, and users can edit the plan if needed before asking Copilot to implement it (38m48s).
  • Copilot Workspace can make changes to multiple files across a project, and users can ask for revisions to the plan and implementation by talking to Copilot in natural language (39m31s).
  • Copilot Workspace supports multiple languages, including Hindi, and can understand and respond to user requests in those languages (39m43s).
  • The tool also lets users run tests and debug their code, and can even help fix errors by suggesting solutions with the new build-and-repair agent (40m31s).
  • Users can configure commands for building, testing, and running their code within Copilot Workspace (40m33s).
  • If a test fails, Copilot Workspace can help users fix the error by suggesting a solution, and users can apply the suggested fix and re-run the test to see if it resolves the issue (40m56s).
  • Once a feature is implemented and tested, users can preview what they've built by running a command within Copilot Workspace, which can suggest a command if one hasn't been configured already (41m44s).
  • GitHub is showcasing the capabilities of Copilot Workspace, which lets developers brainstorm, build, and repair code in a natural language environment (42m36s).
  • Copilot Workspace enables developers to work efficiently, moving as fast as their creativity, and is intended to be the developer environment of the future (42m55s).
  • Copilot Workspace is expanding to include pull requests, allowing developers to address feedback from their team and AI agents, making it a part of the development team (43m5s).
  • Copilot Workspace is part of the next iteration of Copilot and is expected to be a game-changer for professional developers (43m32s).
  • The demos shown were deployed live to production on GitHub.com, and the company looks forward to getting Copilot Workspace into the hands of more developers (43m56s).
  • The feature is being developed with the goal of making coding more accessible and efficient, and GitHub is excited about its potential impact on the development community (44m10s).
  • Designed with the developer at the center, it has the potential to be the single most revolutionary thing built for professional developers since the inception of GitHub (44m16s).
  • Copilot Workspace is being further developed to make it accessible to people who want to build software but cannot write code, making coding more inclusive (44m36s).

GitHub Spark (44m50s)

  • GitHub Spark is an AI-native tool that allows users to build applications entirely in natural language, blurring the line between developers and everyday people (44m50s).
  • With GitHub Spark, anyone can create and share an application in a matter of minutes, regardless of their experience (45m12s).
  • GitHub Spark is not a classic IDE, and users do not need to install programming languages or figure out how to use the terminal (45m32s).
  • The tool features an input field for creating applications, a model picker with different models, and a list of recent Sparks (45m39s).
  • Users can create various applications, such as games, using natural language prompts (45m55s).
  • A live demo showcased the creation of a Tic Tac Toe game with ducks and hippos using GitHub Spark (46m8s).
  • The model behind the scenes does all the work, and the application streams in as it is generated, with a console available for refinement (46m13s).
  • Users can refine their applications, add themes, change the scale, and modify the accent color (47m1s).
  • GitHub Spark also allows users to add custom instructions, store data in a key-value store backed by Azure Cosmos DB, and scan conference badges (47m21s).
  • A demo app was created using GitHub Spark, which scans conference badges and displays the user's information (47m35s).
  • The app was created entirely in natural language, and every step of the process is visible, including all the prompts used (47m41s).
  • The app is functional and can be used on a mobile device (48m22s).

Closing remarks (48m44s)

  • A new wave of personal applications is on the horizon, and the floodgates of innovation have swung wide open, as demonstrated in the presentation (48m49s).
  • For a long time, there has been a significant barrier separating a vast majority of the world's citizens from creating software, due to factors such as language, educational access, or the perceived difficulty of coding (49m6s).
  • This barrier has limited opportunities in the technology economy, making the industry a "Walled Garden" (49m35s).
  • GitHub Spark aims to put one billion cracks in this concrete barrier, and with the help of the community, it is possible to break this barrier and create a more inclusive environment (49m40s).
  • The goal is to realize a world where not just 100 million but one billion people on GitHub can build software together (50m5s).
  • The new creator Network is designed for the age of AI and is intended to be a creator Network for all, which is GitHub's guiding aspiration (50m16s).
  • GitHub is committed to making this vision a reality and invites the community to enjoy the 10th Annual GitHub Universe, with more content and announcements to come (50m31s).
