Code generation, programming assistants

October 14, 2021 — September 24, 2024

faster pussycat
language
machine learning
making things
neural nets
NLP
real time
signal processing
stringology
time series
UI

A cousin to neural automata: writing machines to code for us, because code generation is fancy text generation, which ends up involving similar technology, i.e. large language models.

There are two aspects to making this go: the model and the interface. Sometimes the two are combined, as in Cursor, which makes it hard to structure this page.

1 Security

Let us start with the obvious and important thing. I am vaguely concerned about how much of the world is uploading their source code for everything to these code servers. The potential for abuse is huge.


2 Interfaces

2.1 GitHub Copilot interface

See GitHub Copilot.

2.2 Cursor interface

See Cursor.

2.3 Fauxpilot

FauxPilot: Like GitHub Copilot without Microsoft telemetry • The Register

Updated GitHub Copilot, one of several recent tools for generating programming code suggestions with the help of AI models, remains problematic for some users due to licensing concerns and to the telemetry the software sends back to the Microsoft-owned company.

fauxpilot/fauxpilot: FauxPilot - an open-source alternative to GitHub Copilot server

This is an attempt to build a locally hosted alternative to GitHub Copilot. It uses the SalesForce CodeGen models inside of NVIDIA’s Triton Inference Server with the FasterTransformer backend.

Being able to work offline would be a real win; Copilot loves bandwidth too much.

2.4 Continue

Continue is a plugin for the JetBrains and VS Code IDEs that provides AI-powered code completions. Seems to support bring-your-own model.

2.5 Cody

Cody | AI coding assistant

Cody supports the most powerful LLMs including Claude 3.5, GPT-4o, Gemini 1.5, and Mixtral-8x7B.

You can also bring your own LLM key with Amazon Bedrock and Azure OpenAI.

3 Models and serving them

3.1 GitHub Copilot

GitHub Copilot originally used OpenAI Codex to suggest code completions; IIRC it now uses some off-the-shelf GPT-4 model.

Behind a firewall, we require at least the following whitelist exceptions:

  • vscode-auth.github.com
  • api.github.com
  • copilot-proxy.githubusercontent.com

See Networked VS Code for some more whitelist rules we need for VS Code generally.
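When debugging such a firewalled setup, it helps to check which of these endpoints are actually reachable from inside the network. Here is a minimal sketch in Python — the host list is the whitelist above; the TCP-on-port-443 probe and the function names are my own assumptions, not anything Copilot documents:

```python
import socket

# Hosts from the whitelist above; port 443 because Copilot traffic is HTTPS.
COPILOT_HOSTS = [
    "vscode-auth.github.com",
    "api.github.com",
    "copilot-proxy.githubusercontent.com",
]

def reachable(host: str, port: int = 443, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def report() -> None:
    """Print reachability for each whitelisted host (call this from inside the firewall)."""
    for host in COPILOT_HOSTS:
        print(f"{host}: {'ok' if reachable(host) else 'BLOCKED'}")
```

This only proves the TCP path is open; a proxy that allows the connection but rewrites TLS will still break Copilot.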

3.2 Cursor

Cursor - The AI-first Code Editor

Cursor is an AI-powered code editor that helps you build software faster.

It is a VS Code fork with its own AI engine (“Copilot++”, cheeky) and some extra UI affordances.

3.3 Codeium

Codeium

Codeium has been developed by the team at Exafunction to build on the industry-wide momentum on foundational models. We realised that the combination of recent advances in generative models and our world-class optimised deep learning serving software could provide users with top quality AI-based products at the lowest possible costs (or ideally, free!).

3.4 Codestral Mamba

Codestral Mamba | Mistral AI | Frontier AI in your hands

Following the publishing of the Mixtral family, Codestral Mamba is another step in our effort to study and provide new architectures. It is available for free use, modification, and distribution, and we hope it will open new perspectives in architecture research. Codestral Mamba was designed with help from Albert Gu and Tri Dao.

Unlike Transformer models, Mamba models offer the advantage of linear time inference and the theoretical ability to model sequences of infinite length. It allows users to engage with the model extensively with quick responses, irrespective of the input length. This efficiency is especially relevant for code productivity use cases—this is why we trained this model with advanced code and reasoning capabilities, enabling it to perform on par with SOTA transformer-based models.
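To make that scaling claim concrete, here is a toy cost model — not Mamba itself, just an illustration of the asymptotics: attention re-reads the whole prefix at every generation step, while a recurrent/state-space model updates a fixed-size state.

```python
def attention_step_cost(prefix_len: int) -> int:
    # A transformer attends to every previous token: per-step cost grows with the prefix.
    return prefix_len

def recurrent_step_cost(prefix_len: int) -> int:
    # A recurrent/state-space model updates a fixed-size state: constant per-step cost.
    return 1

def total_cost(step_cost, n_tokens: int) -> int:
    """Total work to generate n_tokens one at a time."""
    return sum(step_cost(t) for t in range(1, n_tokens + 1))

print(total_cost(attention_step_cost, 1000))  # 500500 — O(n^2) overall
print(total_cost(recurrent_step_cost, 1000))  # 1000   — O(n) overall
```

The constant factors in real implementations differ wildly, but this is why long files and long chat sessions favour the recurrent architecture.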

3.5 Ollama/Llama Coder

Two offline things that work well together:

Ollama

Run Llama 3.1, Phi 3, Mistral, Gemma 2, and other models. Customise and create your own.

Llama Coder

Llama Coder is a better and self-hosted Github Copilot replacement for VS Code. Llama Coder uses Ollama and codellama to provide autocomplete that runs on your hardware. Works best with Mac M1/M2/M3 or with RTX 4090.
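Assuming a locally running Ollama server (its default port is 11434) with the codellama model pulled, a completion is just a POST to its generate endpoint. A minimal sketch; the prompt and helper names are invented:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(prompt: str, model: str = "codellama") -> dict:
    """Payload for Ollama's /api/generate; stream=False asks for one JSON reply."""
    return {"model": model, "prompt": prompt, "stream": False}

def complete(prompt: str, model: str = "codellama") -> str:
    """Send a completion request to a locally running Ollama server."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_request(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# e.g. complete("def fib(n):")  # needs `ollama pull codellama` and a running server
```

Llama Coder wraps this kind of call behind VS Code's inline-completion UI, so you never touch the API directly; the sketch is only to show that the moving parts are small.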

3.6 Amazon CodeWhisperer

AI Code Generator - Amazon CodeWhisperer - AWS

Available as part of the AWS Toolkit for Visual Studio (VS) Code and JetBrains, CodeWhisperer currently supports Python, Java, JavaScript, TypeScript, C#, Go, Rust, PHP, Ruby, Kotlin, C, C++, Shell scripting, SQL and Scala. In addition to VS Code and the JetBrains family of IDEs—including IntelliJ, PyCharm, GoLand, CLion, PhpStorm, RubyMine, Rider, WebStorm, and DataGrip—CodeWhisperer is also available for AWS Cloud9, AWS Lambda console, JupyterLab and Amazon SageMaker Studio.

Free for individual use.

3.7 Others

4 Pedagogy

Coding assistants are a great way to learn to code (if that is still a valuable thing to do?).

5 Incoming


Querying Glean:

Glean is a system for working with facts about source code. It is designed for collecting and storing detailed information about code structure, and providing access to the data to power tools and experiences from online IDE features to offline code analysis.

For example, Glean could answer all the questions you’d expect your IDE to answer, accurately and efficiently on a large-scale codebase. Things like:

  • Where is the definition of this method?
  • Where are all the callers of this function?
  • Who inherits from this class?
  • What are all the declarations in this file?
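A toy sketch of the idea only — this is Python, not Glean's actual query language (Angle): if you store facts about code as tuples, those IDE questions become simple filters. All the code names below are invented examples.

```python
# Facts: (predicate, subject, object). Every name here is an invented example.
FACTS = [
    ("defined_in",  "Parser.parse", "parser.py:42"),
    ("calls",       "main",         "Parser.parse"),
    ("calls",       "repl",         "Parser.parse"),
    ("inherits",    "JsonParser",   "Parser"),
    ("declared_in", "Parser",       "parser.py"),
    ("declared_in", "JsonParser",   "json_parser.py"),
]

def query(predicate, obj=None, subject=None):
    """Return matching facts, filtering on whichever fields are given."""
    return [
        (p, s, o) for (p, s, o) in FACTS
        if p == predicate
        and (subject is None or s == subject)
        and (obj is None or o == obj)
    ]

# Where is the definition of this method?
print(query("defined_in", subject="Parser.parse"))
# Who are all the callers of this function?
print([s for _, s, _ in query("calls", obj="Parser.parse")])
# Who inherits from this class?
print([s for _, s, _ in query("inherits", obj="Parser")])
```

Glean's actual contribution is doing this at monorepo scale with typed schemas and incremental indexing, which this dict-of-tuples toy obviously waves away.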
