- Learning by doing
- Trainers with practical experience
- Classroom training
- Detailed course material
- Clear content description
- Tailor-made content possible
- Training guaranteed to proceed
- Small groups
The course Open Source AI dives into the power of open-source LLMs like DeepSeek, Mistral, and LLaMA. Participants learn model selection, prompting, fine-tuning, deployment, and how to build practical AI tools using accessible frameworks and APIs.
This module explores popular open-source models such as DeepSeek, Mistral, and LLaMA. It compares their architectures and discusses model hubs such as Hugging Face, performance, token usage, pricing, and responsible deployment practices.
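To give a feel for the token limits and tokenizer differences covered here, the minimal sketch below loads two tokenizers from the Hugging Face Hub and counts tokens for the same sentence. The model ids are illustrative examples rather than course requirements, and some checkpoints require accepting a license before download.

from transformers import AutoTokenizer

text = "Open-source LLMs let you inspect, adapt, and self-host your models."

# Illustrative model ids; any checkpoints from the Hugging Face Hub work the same way.
for model_id in ["mistralai/Mistral-7B-Instruct-v0.2", "deepseek-ai/deepseek-llm-7b-chat"]:
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    token_ids = tokenizer.encode(text)
    print(f"{model_id}: {len(token_ids)} tokens")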
Participants learn how to install and configure DeepSeek, use multilingual features, and apply basic prompting. The module also covers integration, model compression, performance tips, and deployment options for developers.
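As a taste of what getting started looks like, the sketch below runs a basic prompt against a DeepSeek chat model using the Hugging Face transformers library; the checkpoint id and generation settings are illustrative and assume a machine with enough memory (or a quantized variant).

from transformers import pipeline

# Illustrative checkpoint; smaller or quantized variants are often more practical locally.
generator = pipeline(
    "text-generation",
    model="deepseek-ai/deepseek-llm-7b-chat",
    device_map="auto",
)

prompt = "Explain in one sentence what an open-source LLM is."
result = generator(prompt, max_new_tokens=60)
print(result[0]["generated_text"])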
This module explores prompting techniques including zero-shot, few-shot, prompt chaining, and function calling. It also introduces LangChain, RAG pipelines, custom memory, vector stores, and efficient management of chat history and agent flows.
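To make zero-shot versus few-shot prompting and prompt chaining concrete, here is a small sketch in plain Python; generate is a placeholder for whichever open model or endpoint you call, not a real library function.

def generate(prompt: str) -> str:
    # Placeholder: call your open-source model here (local DeepSeek/Mistral,
    # an inference server, or an API) and return its completion as a string.
    raise NotImplementedError

# Zero-shot: the task is described, with no examples.
zero_shot = "Classify the sentiment of this review as positive or negative:\nBattery life is terrible."

# Few-shot: a handful of worked examples steer the output format.
few_shot = (
    "Classify the sentiment of each review as positive or negative.\n"
    "Review: Great screen, fast delivery. -> positive\n"
    "Review: Stopped working after a week. -> negative\n"
    "Review: Battery life is terrible. ->"
)

# Prompt chaining: the output of one step becomes input for the next.
complaint = "Battery life is terrible and support never replied."
issues = generate("List the customer's issues as short bullet points:\n" + complaint)
reply = generate("Write a brief, polite support reply addressing these issues:\n" + issues)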
This module focuses on fine-tuning workflows using datasets, LoRA, and PEFT. You'll explore training via Colab or AWS, testing outputs, evaluating prompts and embeddings, and enabling models to evolve through continuous learning.
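The sketch below shows what a LoRA configuration with the PEFT library typically looks like before training; the base model id and hyperparameters are illustrative assumptions, not prescribed course values.

from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Illustrative base checkpoint; any causal LM from the Hugging Face Hub works similarly.
base_model = AutoModelForCausalLM.from_pretrained("deepseek-ai/deepseek-llm-7b-base")

# LoRA trains small adapter matrices instead of updating all model weights.
lora_config = LoraConfig(
    r=8,                                  # rank of the adapter matrices
    lora_alpha=16,                        # scaling factor for the adapters
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
    task_type="CAUSAL_LM",
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # usually well under 1% of the full parameter count
# The wrapped model can then be trained with the standard transformers Trainer.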
Participants learn how to deploy models using FastAPI, Streamlit, and Docker. The module also covers optimization, edge/cloud strategies, local setups, performance monitoring, version control, and cost-aware deployment planning.
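As a flavour of the deployment topics, here is a minimal FastAPI wrapper around a text-generation pipeline; the route name, request schema, and model id are illustrative, and a production setup would add batching, authentication, and monitoring.

from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()

# Illustrative checkpoint; load whichever open model you actually deploy.
generator = pipeline("text-generation", model="deepseek-ai/deepseek-llm-7b-chat")

class GenerateRequest(BaseModel):
    prompt: str
    max_new_tokens: int = 100

@app.post("/generate")
def generate_text(req: GenerateRequest):
    result = generator(req.prompt, max_new_tokens=req.max_new_tokens)
    return {"completion": result[0]["generated_text"]}

# Run locally with: uvicorn main:app --reload
# A small Dockerfile around this app gives a reproducible, portable deployment.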
Participants examine real-world applications such as legal summarizers, healthcare chatbots, and multilingual generators. Other cases include AI-powered CRM assistants, educational bots, retrieval tools, and open-source copilots with embedded memory.
The course Open Source AI is intended for developers, data scientists, machine learning engineers, and AI enthusiasts who want to work with open-source AI tools.
To participate in the course, basic knowledge of Python and data analysis is required. Experience with machine learning or neural networks is beneficial.
The course is conducted under the guidance of an experienced trainer, with theory and practice alternating. Practical examples and case studies are used for illustration.
After successfully completing the course, participants will receive a certificate of participation in the course Open Source AI.
Module 1: Overview of Open LLMs
- DeepSeek, Mistral, Mixtral, and LLaMA
- Benefits of open-source AI
- Architecture comparisons
- Use cases and performance
- Hugging Face and model hubs
- Responsible deployment
- Token limits and pricing
- Embeddings and tokenizers
- Current limitations
- Benchmarking tools

Module 2: Getting Started with DeepSeek
- DeepSeek architecture
- Installing and configuring
- Sample use cases
- Prompting strategies
- Tools and APIs
- Multilingual capabilities
- Performance tips
- Model compression
- Developer integrations
- Deployment options

Module 3: Prompting and Tooling
- Zero-shot vs few-shot
- Prompt chaining
- Function calling
- LangChain basics
- RAG workflows
- Vector databases
- Indexing content
- Custom memory solutions
- Chat history management
- Agent architecture

Module 4: Fine-Tuning Open Models
- Dataset preparation
- Supervised fine-tuning
- LoRA and PEFT
- Training pipelines
- Using Colab/AWS for training
- Evaluation and testing
- Prompt evaluation
- Embedding evaluation
- Real-world use cases
- Continuous learning

Module 5: Deployment and Scaling
- API wrappers
- Using FastAPI with models
- Streamlit for frontends
- Dockerized deployments
- Resource optimization
- Running locally
- Edge vs cloud deployment
- Monitoring performance
- Versioning models
- Cost considerations

Module 6: Case Studies
- Legal document summarizer
- Healthcare chatbot
- AI-powered CRM assistant
- Multilingual content generator
- Financial insights analyzer
- Open-source copilot
- Email generator with memory
- Search-augmented assistants
- Education and tutoring bots
- Knowledge retrieval systems