AI is one of the key technologies in our sector. It is becoming increasingly relevant, so it is important to stay trained and up to date.

In this section, you will find some of the most interesting resources. If you want to add a new one, post it in the #ai Slack channel; if you want to discuss AI with other colleagues, use the #ai-adoption-project channel.

Table of contents

- Courses
- Tools & Services
- Videos
- Readings (Articles, posts, papers…)
- Repositories
- Others (Slack channels, Cheat sheets…)

Courses

| Title | Description | Link | Author | Topic |
| --- | --- | --- | --- | --- |
| ChatGPT Prompt Engineering for Developers | Learn prompt engineering best practices for application development. | | | |

Tools & Services

| Tool | Description | Topic | Link |
| --- | --- | --- | --- |
| Codium | Generates meaningful tests for busy devs. With CodiumAI, you get non-trivial tests (and trivial, too!) suggested right inside your IDE, so you can code smart, create more value, and stay confident when you push. | Automatic testing | https://www.codium.ai/ |
| PhotoPrism | PhotoPrism is an AI-powered photos app for the decentralized web. It uses the latest technologies to tag and find pictures automatically without getting in your way. You can run it at home, on a private server, or in the cloud. | Images | https://docs.photoprism.app/ |
| There’s an AI for that | The largest AI aggregator. | AI | https://theresanaiforthat.com/ |
| Cookup | Discover AI apps for every use case. | AI | https://cookup.ai/ |
| Hashnode | GPT chat service focused on solving development problems. | Development, GPT | https://hashnode.com/rix |
| Phind | Your personal search-enabled coding agent based on GPT. | Development, GPT | https://www.phind.com/ |
| Chatling | Answer customers instantly and accurately with AI chatbots that can be trained automatically on your website content, documents, knowledge base, and other resources. | Chatbot | https://chatling.ai/ |
| AI Component Generator | Ask for any component and the AI will generate it for you. | GPT, Development | https://ai2ui.co/ |
| CommitGPT | Automatically generate commit messages using ChatGPT. | GPT, Development | https://github.com/RomanHotsiy/commitgpt |
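To make the idea behind CommitGPT concrete, here is a minimal, hypothetical sketch of building a commit-message prompt from a staged diff. The `build_commit_prompt` helper and the prompt wording are our own illustration, not CommitGPT's actual implementation:

```python
# Hypothetical sketch of the idea behind CommitGPT: feed the staged
# git diff to a chat model and ask for a conventional commit message.
# Helper name and prompt wording are illustrative, not CommitGPT's own.

def build_commit_prompt(diff: str, max_chars: int = 4000) -> str:
    """Build a prompt asking an LLM for a conventional commit message."""
    truncated = diff[:max_chars]  # keep the prompt within the model's context window
    return (
        "Suggest a concise git commit message in the conventional commit "
        "format (e.g. 'feat: add login form') for the following diff:\n\n"
        f"{truncated}"
    )

if __name__ == "__main__":
    sample_diff = (
        "--- a/app.py\n+++ b/app.py\n"
        "+def login(user, password):\n+    ...\n"
    )
    prompt = build_commit_prompt(sample_diff)
    print(prompt.splitlines()[0])
```

In a real tool, the resulting prompt would be sent to a chat-completion API and the model's reply used as the commit message.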

Videos

| Title | Description | Topic | Link |
| --- | --- | --- | --- |
| AI/ML Security & OSS | Presentation on AI/ML security issues. | Security, ML | https://www.youtube.com/watch?v=kTMgG5gn-oU#t=10m52s&ab_channel=OpenSSF |
| Neural Networks from the ground up | Video explaining how neural networks work, starting from the basics. | Neural Networks | https://www.youtube.com/watch?v=aircAruvnKk |
| Why GitHub Copilot? Learning journey | Online session series on GitHub Copilot, suitable for both technical and non-technical audiences. | Copilot | |

Readings (Articles, posts, papers…)

| Title | Description | Topic | Link |
| --- | --- | --- | --- |
| Learn Prompting | Prompt engineering (PE) is the process of communicating effectively with an AI to achieve desired results. As AI technology continues to advance rapidly, mastering prompt engineering has become a particularly valuable skill, and PE techniques can be applied to a wide variety of tasks. The course is tailored to beginners, making it a perfect starting point if you're new to AI and PE, but even experienced readers will find valuable insights; the content ranges from an introduction to AI to advanced PE techniques. | Prompting | https://learnprompting.org/docs/intro |
| Fast.ai | Blog with recurring posts about neural nets. | Neural networks | https://www.fast.ai/ |
| Uncensored Models | By "model", the author means a Hugging Face transformer model that is instruct-trained, so you can ask it questions and get a response, as with ChatGPT. These models are aligned because they are trained on data generated by ChatGPT, which is itself aligned by OpenAI's alignment team; as it is a black box, we don't know all the reasons for the decisions made, but we can observe that it is generally aligned with American popular culture, obeys American law, and carries a liberal and progressive political bias. | AI Models | https://erichartford.com/uncensored-models |
| OpenAI Cookbook | The OpenAI Cookbook shares example code for accomplishing common tasks with the OpenAI API. | OpenAI, GPT, Images | https://github.com/openai/openai-cookbook/tree/main |
| OpenAI Evals | Evals is a framework for evaluating LLMs (large language models) or systems built using LLMs as components. It also includes an open-source registry of challenging evals. | OpenAI, Evals, LLM | https://github.com/openai/evals |
| Building an Image Recognition App in JavaScript using Pinecone, Hugging Face, and Vercel | Article on developing an image recognition system with JavaScript. | Images | https://dev.to/rschwabco/building-an-image-recognition-app-in-javascript-using-pinecone-hugging-face-and-vercel-2b0p |
| Building LLM applications for production | Part 1 discusses the key challenges of productionizing LLM applications and the solutions the author has seen. Part 2 discusses how to compose multiple tasks with control flows (e.g. if statements, for loops) and incorporate tools (e.g. SQL executors, bash, web browsers, third-party APIs) for more complex and powerful applications. Part 3 covers some promising use cases companies are building on top of LLMs and how to construct them from smaller tasks. | LLM | https://huyenchip.com/2023/04/11/llm-engineering.html |
| Using GPT-3 for Search and Recommendations of Text Content | Discusses how to use GPT-3 embedding vectors in a recommendation system that applies cosine similarity to find similar documents. | GPT | https://betterprogramming.pub/using-gpt-3-for-search-and-recommendations-7987d8908331 |
| LongNet: Scaling Transformers to 1,000,000,000 Tokens | Scaling sequence length has become a critical demand in the era of large language models, but existing methods struggle with either computational complexity or model expressivity, restricting the maximum sequence length. LongNet is a Transformer variant that can scale sequence length to more than 1 billion tokens without sacrificing performance on shorter sequences. It introduces dilated attention, which expands the attentive field exponentially as the distance grows. LongNet has linear computation complexity and a logarithmic dependency between tokens, can serve as a distributed trainer for extremely long sequences, and its dilated attention is a drop-in replacement for standard attention that integrates seamlessly with existing Transformer-based optimization. Experiments demonstrate strong performance on both long-sequence modeling and general language tasks. | AI Models | https://arxiv.org/abs/2307.02486 |
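The cosine-similarity approach from the GPT-3 search-and-recommendations article above can be sketched in a few lines. The embedding vectors here are invented toy values for illustration; a real system would obtain them from an embeddings API:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def most_similar(query_vec, doc_vecs):
    """Return the document id whose embedding is closest to the query."""
    return max(doc_vecs, key=lambda doc_id: cosine_similarity(query_vec, doc_vecs[doc_id]))

# Toy embeddings, invented for illustration (real ones come from an embeddings API).
docs = {
    "cooking": [0.9, 0.1, 0.0],
    "sports":  [0.1, 0.9, 0.1],
}
print(most_similar([0.8, 0.2, 0.0], docs))  # → cooking
```

Because cosine similarity depends only on the angle between vectors, documents whose embeddings point in a similar direction to the query rank highest regardless of vector magnitude.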

Repositories

| Title | Description | Topic | Link |
| --- | --- | --- | --- |
| GPT Engineer | GPT Engineer is made to be easy to adapt and extend, and to let your agent learn how you want your code to look. It generates an entire codebase based on a prompt. | GPT | https://github.com/AntonOsika/gpt-engineer |
| Pythagora | Pythagora is on a mission to make automated tests fully autonomous. Just run one command and watch the tests being created with GPT-4. | Automatic testing | https://github.com/Pythagora-io/pythagora |
| Project Miyagi | Project Miyagi serves as the foundation for an envisioning workshop that reimagines the design, development, and deployment of intelligent applications using Microsoft's Copilot stack. It demonstrates that integrating intelligence transcends a simple chat interface and permeates every aspect of your product experience, utilizing semantic data to generate personalized interactions and effectively address individual needs. Through a comprehensive exploration of generative and discriminative use cases, Miyagi offers hands-on experience with cutting-edge programming paradigms that harness the power of foundation models in every workflow. It also introduces traditional software engineers to emerging design patterns in prompt engineering (chain-of-thought, few-shot, retrieval-augmentation), vectorization for long-term memory, and tools or affordances to augment and ground LLMs. | Copilot | https://github.com/Azure-Samples/miyagi |
| LLM | Bootstrap knowledge of LLMs ASAP, with a bias/focus towards GPT. Avoids being a link dump and aims to provide only valuable, well-tuned information. | LLM | https://gist.github.com/rain-1/eebd5e5eb2784feecf450324e3341c8d |
| Alpaca Electron | Alpaca Electron is built from the ground up to be the easiest way to chat with Alpaca AI models. No command line or compiling needed! | GPT, AI Models | https://github.com/ItsPi3141/alpaca-electron |
| Quivr | Dump all your files and chat with them using your Generative AI second brain, powered by LLMs (GPT-3.5/4, Private, Anthropic, VertexAI) and embeddings. | LLM, GPT | https://github.com/StanGirard/quivr |
| LLM Client | LLMClient simplifies building with LLMs, function calling, and reasoning. | LLM | https://github.com/dosco/llm-client |
| InvokeAI | InvokeAI is a leading creative engine for Stable Diffusion models, empowering professionals, artists, and enthusiasts to generate and create visual media using the latest AI-driven technologies. It offers an industry-leading WebUI, supports terminal use through a CLI, and serves as the foundation for multiple commercial products. | Images | https://github.com/invoke-ai/InvokeAI |
| AzureAI | A hub with a curated awesome list of Azure AI samples. | OpenAI | https://github.com/Azure-Samples/azure-ai |
| AutoGPT | An experimental open-source attempt to make GPT-4 fully autonomous. | GPT | https://github.com/Significant-Gravitas/Auto-GPT |
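The few-shot prompting pattern mentioned in the Project Miyagi entry can be illustrated with a short sketch: a handful of labelled examples is prepended to the input so the model infers the task from them. The task, example sentences, and labels below are invented for illustration:

```python
# Illustrative sketch of the few-shot prompting pattern: prepend
# labelled examples so the model infers the task from them.
# The task, sentences, and labels are invented for illustration.

FEW_SHOT_EXAMPLES = [
    ("The release went smoothly and users love it.", "positive"),
    ("The deploy broke the login page again.", "negative"),
]

def build_few_shot_prompt(text: str) -> str:
    """Assemble a few-shot classification prompt for an LLM."""
    lines = ["Classify the sentiment of each sentence as positive or negative.", ""]
    for sentence, label in FEW_SHOT_EXAMPLES:
        lines.append(f"Sentence: {sentence}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    # Leave the final label blank for the model to complete.
    lines.append(f"Sentence: {text}")
    lines.append("Sentiment:")
    return "\n".join(lines)

print(build_few_shot_prompt("Great new feature!"))
```

The prompt ends with an unfinished "Sentiment:" line, so a completion model naturally continues it with a label in the same format as the examples.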

Others (Slack channels, Cheat sheets…)

One Beyond Slack Channels

#guild-ai - Meeting point to share knowledge about AI, interesting links, potential projects… whatever you want related to AI!