Applications of 1-bit LLMs
1️⃣ In a remote village, a student can use a mobile device with a 1-bit LLM to get personalized tutoring without internet access.
2️⃣ In a low-resource clinic, healthcare workers use a mobile app with a 1-bit LLM to diagnose common diseases from symptoms or images offline.
3️⃣ Farmers use a 1-bit LLM app to diagnose crop diseases and receive personalized farming advice based on soil type and weather patterns.
4️⃣ In a disaster-prone area, a 1-bit LLM-powered app helps first responders and citizens communicate critical information in multiple languages offline.
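All four scenarios above depend on the tiny memory footprint of 1-bit (in practice, ternary) weights, which lets a model run on a phone or laptop without a server. Here is a rough NumPy sketch of the idea, assuming BitNet-style absmean quantization to {-1, 0, +1}; it is illustrative only, not the actual BitNet code:

```python
# Illustrative sketch of "1-bit" (ternary) weight quantization, assuming the
# BitNet b1.58 absmean scheme: weights become {-1, 0, +1} plus one fp scale.
# This is NOT the actual BitNet implementation.
import numpy as np

def quantize_ternary(w: np.ndarray):
    scale = np.mean(np.abs(w)) + 1e-8            # per-tensor absmean scale
    w_q = np.clip(np.round(w / scale), -1, 1)    # ternary weights
    return w_q.astype(np.int8), scale

def ternary_matmul(x: np.ndarray, w_q: np.ndarray, scale: float):
    # With ternary weights the matmul reduces to additions/subtractions of x.
    return (x @ w_q) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(4096, 4096)).astype(np.float32)  # ~64 MB in fp32
w_q, s = quantize_ternary(w)                           # ~1.6 bits/weight if bit-packed
x = rng.normal(size=(1, 4096)).astype(np.float32)
print(np.corrcoef((x @ w).ravel(), ternary_matmul(x, w_q, s).ravel())[0, 1])
```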
Many data scientists don't know how to push ML models to production. Here's the recipe 👇
𝗞𝗲𝘆 𝗜𝗻𝗴𝗿𝗲𝗱𝗶𝗲𝗻𝘁𝘀
🔹 𝗧𝗿𝗮𝗶𝗻 / 𝗧𝗲𝘀𝘁 𝗗𝗮𝘁𝗮𝘀𝗲𝘁 - Ensure Test is representative of Online data
🔹 𝗙𝗲𝗮𝘁𝘂𝗿𝗲 𝗘𝗻𝗴𝗶𝗻𝗲𝗲𝗿𝗶𝗻𝗴 𝗣𝗶𝗽𝗲𝗹𝗶𝗻𝗲 - Generate features in real-time
🔹 𝗠𝗼𝗱𝗲𝗹 𝗢𝗯𝗷𝗲𝗰𝘁 - Trained scikit-learn or TensorFlow model
🔹 𝗣𝗿𝗼𝗷𝗲𝗰𝘁 𝗖𝗼𝗱𝗲 𝗥𝗲𝗽𝗼 - Save the model project code to GitHub
🔹 𝗔𝗣𝗜 𝗙𝗿𝗮𝗺𝗲𝘄𝗼𝗿𝗸 - Use FastAPI or Flask to build a model API
🔹 𝗗𝗼𝗰𝗸𝗲𝗿 - Containerize the ML model API
🔹 𝗥𝗲𝗺𝗼𝘁𝗲 𝗦𝗲𝗿𝘃𝗲𝗿 - Choose a cloud service, e.g. AWS SageMaker
🔹 𝗨𝗻𝗶𝘁 𝗧𝗲𝘀𝘁𝘀 - Test inputs & outputs of functions and APIs
🔹 𝗠𝗼𝗱𝗲𝗹 𝗠𝗼𝗻𝗶𝘁𝗼𝗿𝗶𝗻𝗴 - Evidently AI, a simple open-source tool for ML monitoring
𝗣𝗿𝗼𝗰𝗲𝗱𝘂𝗿𝗲
𝗦𝘁𝗲𝗽 𝟭 - 𝗗𝗮𝘁𝗮 𝗣𝗿𝗲𝗽𝗮𝗿𝗮𝘁𝗶𝗼𝗻 & 𝗙𝗲𝗮𝘁𝘂𝗿𝗲 𝗘𝗻𝗴𝗶𝗻𝗲𝗲𝗿𝗶𝗻𝗴
Don't push a model just because it hits 90% accuracy on the train set. Judge it on the test set, and only if the test set is representative of the online data. Use a scikit-learn Pipeline to chain preprocessing steps such as null handling.
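A minimal sketch of such a pipeline, assuming hypothetical column names and a logistic regression purely for illustration:

```python
# Sketch: chain null handling, scaling, and encoding with the model in one
# scikit-learn Pipeline. Column names and the dataset are hypothetical.
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

numeric_cols = ["age", "income"]        # hypothetical numeric features
categorical_cols = ["country"]          # hypothetical categorical feature

preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), numeric_cols),
    ("cat", Pipeline([("impute", SimpleImputer(strategy="most_frequent")),
                      ("onehot", OneHotEncoder(handle_unknown="ignore"))]), categorical_cols),
])

model = Pipeline([("preprocess", preprocess),
                  ("clf", LogisticRegression(max_iter=1000))])
# model.fit(X_train, y_train); model.score(X_test, y_test)  # judge on the test set
```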
𝗦𝘁𝗲𝗽 𝟮 - 𝗠𝗼𝗱𝗲𝗹 𝗗𝗲𝘃𝗲𝗹𝗼𝗽𝗺𝗲𝗻𝘁
Train your model with frameworks like scikit-learn or TensorFlow. Push the model code, including the preprocessing, training, and validation scripts, to GitHub for reproducibility.
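A sketch of training and serializing the model object that the API will load later; the synthetic dataset, random forest, and file path are placeholders:

```python
# Train on a held-out split, report test (not train) accuracy, and persist
# the fitted model. Dataset, model choice, and path are placeholders.
import joblib
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5_000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

clf = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))

joblib.dump(clf, "model.joblib")   # commit the training code (not the binary) to GitHub
```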
𝗦𝘁𝗲𝗽 𝟯 - 𝗔𝗣𝗜 𝗗𝗲𝘃𝗲𝗹𝗼𝗽𝗺𝗲𝗻𝘁 & 𝗖𝗼𝗻𝘁𝗮𝗶𝗻𝗲𝗿𝗶𝘇𝗮𝘁𝗶𝗼𝗻
Your model needs a "/predict" endpoint that receives a JSON object in the request and returns a JSON object with the model score in the response. You can use frameworks like FastAPI or Flask. Containerize the API so that it is agnostic to the server environment.
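A minimal FastAPI sketch of such an endpoint; the payload field name, model path, and Docker commands are assumptions:

```python
# main.py - sketch of a /predict endpoint: JSON request in, JSON score out.
# The "values" field name and model.joblib path are illustrative assumptions.
from typing import List

import joblib
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")      # artifact saved in Step 2

class Features(BaseModel):
    values: List[float]                  # one feature vector per request

@app.post("/predict")
def predict(features: Features):
    x = np.asarray(features.values).reshape(1, -1)
    score = float(model.predict_proba(x)[0, 1])
    return {"score": score}

# Run locally:   uvicorn main:app --reload
# Containerize:  a Dockerfile based on python:3.11-slim that installs
#                requirements.txt and runs uvicorn on 0.0.0.0:8000
```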
𝗦𝘁𝗲𝗽 𝟰 - 𝗧𝗲𝘀𝘁𝗶𝗻𝗴 & 𝗗𝗲𝗽𝗹𝗼𝘆𝗺𝗲𝗻𝘁
Write tests that validate the inputs and outputs of the API functions to catch errors early. Then deploy the containerized service to a remote platform such as AWS SageMaker.
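A pytest sketch using FastAPI's TestClient, assuming the app from Step 3 lives in main.py and model.joblib is present:

```python
# test_api.py - validate the /predict endpoint's inputs and outputs.
# Assumes main.py from Step 3 and a saved model.joblib are available.
from fastapi.testclient import TestClient

from main import app

client = TestClient(app)

def test_predict_returns_a_valid_score():
    resp = client.post("/predict", json={"values": [0.1] * 10})
    assert resp.status_code == 200
    body = resp.json()
    assert "score" in body and 0.0 <= body["score"] <= 1.0

def test_predict_rejects_a_bad_payload():
    resp = client.post("/predict", json={"values": "not-a-list"})
    assert resp.status_code == 422      # pydantic validation error
```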
𝗦𝘁𝗲𝗽 𝟱 - 𝗠𝗼𝗻𝗶𝘁𝗼𝗿𝗶𝗻𝗴
Set up monitoring tools like Evidently AI, or use the monitoring built into AWS SageMaker. I use such tools to track performance metrics and data drift on online data.
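A sketch of a data drift check with Evidently, assuming the Report / DataDriftPreset API of the 0.4.x releases (newer versions reorganized these imports); the CSV paths are placeholders:

```python
# Sketch: compare recent online data against the training data and produce
# an HTML drift report. Evidently 0.4.x-style API; file paths are placeholders.
import pandas as pd
from evidently.metric_preset import DataDriftPreset
from evidently.report import Report

reference = pd.read_csv("training_features.csv")   # data the model was trained on
current = pd.read_csv("online_features.csv")       # recent online data

report = Report(metrics=[DataDriftPreset()])
report.run(reference_data=reference, current_data=current)
report.save_html("data_drift_report.html")         # inspect drifted columns, retrain if needed
```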
This week we are going to upload a video playlist on how to build Agentic AI projects on our YouTube channel.
DO SUBSCRIBE TO OUR YOUTUBE CHANNEL
https://youtube.com/@dataspoof1977?si=6CdAg1x6mvPxqG6-
Agentic AI #1
How to create a flight agent using the Phidata framework
Do watch it, like and subscribe to our YouTube channel. More videos are coming
https://youtu.be/ivt5q_pIeE4?si=7P1Z-9LHlQKlhttO
Agentic AI #2
How to create an AI agent using CrewAI
Do watch it, like and subscribe to our YouTube channel. More videos are coming
https://youtu.be/ikDJ56k0Y9U?si=UyUkuMXOTC8kPC0l
Feedback received for the Generative AI training from students at the University of Texas.
DM us on WhatsApp for real-time training: +9183182 38637
How to create an end-to-end LLM application on AWS using services like Lambda, S3, CloudWatch, IAM, and Bedrock
Do watch it, like and subscribe to our YouTube channel
https://youtu.be/-3Pk8KFPsB4?si=7l4An6RZGBZDowH5
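For a taste of the core call, here is a hedged sketch of a Lambda handler invoking a model through Bedrock with boto3; the model ID and request body format are assumptions and differ per model family, so check the Bedrock docs for the model you enable:

```python
# Sketch of a Lambda handler calling Amazon Bedrock via boto3.
# The model ID and the Anthropic-style request body are assumptions;
# each Bedrock model family expects a different payload format.
import json

import boto3

bedrock = boto3.client("bedrock-runtime")

def lambda_handler(event, context):
    prompt = event.get("prompt", "Hello")
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{"role": "user", "content": prompt}],
    })
    resp = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",   # assumed model ID
        body=body,
        contentType="application/json",
        accept="application/json",
    )
    answer = json.loads(resp["body"].read())["content"][0]["text"]
    # Results could then be written to S3 and logged to CloudWatch, as in the video.
    return {"statusCode": 200, "body": json.dumps({"answer": answer})}
```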
AutomatedCleaning is a Python library for automated data cleaning. It helps preprocess and analyze datasets by handling missing values, outliers, spelling corrections, and more; a minimal pandas sketch of a few of these checks follows the feature list.
Features
Supports both large (100+ GB) and small datasets
Detects and handles missing values and duplicate records
Identifies and corrects spelling errors in categorical values
Detects outliers
Detects and fixes data imbalance
Identifies and corrects skewness in numerical data
Checks for correlation and detects multicollinearity
Analyzes cardinality in categorical columns
Identifies and cleans text columns
Detects JSON-type columns
Performs univariate, bivariate, and multivariate analysis
https://lnkd.in/gmaStAsp
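The snippet below is not the AutomatedCleaning API (see the link above for that); it is only a minimal pandas sketch of a few of the listed checks, to show what missing-value, duplicate, and IQR outlier detection look like:

```python
# Minimal pandas sketch of a few of the checks above; NOT the AutomatedCleaning API.
import pandas as pd

def basic_cleaning_report(df: pd.DataFrame) -> dict:
    report = {
        "missing_per_column": df.isna().sum().to_dict(),
        "duplicate_rows": int(df.duplicated().sum()),
        "outliers_per_numeric_column": {},
    }
    for col in df.select_dtypes("number").columns:
        q1, q3 = df[col].quantile([0.25, 0.75])
        iqr = q3 - q1
        mask = (df[col] < q1 - 1.5 * iqr) | (df[col] > q3 + 1.5 * iqr)
        report["outliers_per_numeric_column"][col] = int(mask.sum())
    return report

df = pd.DataFrame({"age": [25, 30, None, 120], "city": ["NY", "NY", "LA", "LA"]})
print(basic_cleaning_report(df))
```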
Let's learn about Model Interpretability
Interpretability is essential for:
Model debugging - Why did my model make this mistake?
Feature Engineering - How can I improve my model?
Detecting fairness issues - Does my model discriminate?
Human-AI cooperation - How can I understand and trust the model's decisions?
Regulatory compliance - Does my model satisfy legal requirements?
High-risk applications - Healthcare, finance, judicial systems, etc.
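As a concrete starting point for the use cases above, here is a sketch of model-agnostic interpretability via scikit-learn's permutation importance; the public dataset and random forest are placeholders:

```python
# Permutation feature importance: shuffle each feature on held-out data and
# measure the drop in score; big drops mean the model relies on that feature.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, test_size=0.2, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
for i in result.importances_mean.argsort()[::-1][:5]:          # top 5 features
    print(f"{data.feature_names[i]:<25} {result.importances_mean[i]:.4f}")
```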
Master Python programming in 30 days
https://www.udemy.com/course/master-python-programming-in-30-days-2025/?couponCode=7E26C0F137E5723E1E63