Gen AI for Quant Fin Python Modeling 101: Hands-on Using BERT

Python & Generative AI 101 for Beginners: fine-tuning BERT and connecting a chat feature to a logistic regression backend.

Course Description

Python & Generative AI 101 for Beginners

This course is a hands-on introduction to generative AI for quantitative finance modeling in Python. We regularly fix and improve our code with ChatGPT; it plays a role similar to Copilot for Python, but because we work in notebooks we use ChatGPT.

The course covers Python generative AI for modeling with ChatGPT and Copilot. We show a GPT-powered chat interface for querying, re-running, and tuning models, driven by a manual configuration mapped to Hugging Face. This fine-tuning and chat-to-logistic-regression backend can be extended with more capable products such as the OpenAI API. In the main demo, we query an already saved logistic regression model through a GPT-powered chat interface to retrain it, change its features, and make other adjustments.
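
As a rough illustration of how a chat message can drive the backend (the course keeps this to about 4-5 manually mapped cases), the sketch below maps free text to an action on the saved model; the function name and keyword rules are assumptions for illustration, not the course's exact code.

    # Minimal sketch of the manual chat-command-to-backend mapping described
    # above; the function name and keyword rules are illustrative assumptions.
    def handle_command(message: str) -> str:
        """Map a free-text chat message to an action on the saved model."""
        text = message.lower()
        if "retrain" in text:
            return "retrain"           # retrain the saved logistic regression
        if "feature" in text:
            return "change_features"   # change the feature set before retraining
        if "simulation" in text or "rerun" in text:
            return "rerun_simulation"  # re-run the simulation and return results
        return "predict"               # default: score new inputs with the model

    print(handle_command("please retrain the model on the latest data"))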

The course revolves around two projects:

  1. Fine-Tuning BERT with a Logistic Regression Layer
  2. Deploying Models for Real-Time Analytics: How to use tools like Flask/FastAPI to serve a text-based or data-generating model.
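
For project 2, a minimal FastAPI sketch of the serving idea is shown below; the model file name, request schema, and endpoint path are assumptions for illustration rather than the course's exact code.

    # Minimal sketch of serving a previously saved model over HTTP with FastAPI.
    # The file name "logreg_model.joblib" and the request schema are assumptions.
    import joblib
    from fastapi import FastAPI
    from pydantic import BaseModel

    app = FastAPI()
    model = joblib.load("logreg_model.joblib")  # load the saved logistic regression

    class PredictRequest(BaseModel):
        features: list[float]  # one row of numeric features

    @app.post("/predict")
    def predict(req: PredictRequest):
        prediction = model.predict([req.features])[0]
        return {"prediction": int(prediction)}

Saved as main.py, this would be started with "uvicorn main:app --reload" and queried with an HTTP POST request.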

Topics Introduced

  1. Intro to BERT vs GPT
  2. Intro to Torch and Tensors
  3. Intro to FastAPI App
  4. In-memory Logistic Regression model
  5. Intro to transformers classes such as Trainer, TrainingArguments, BertTokenizer, and BertForSequenceClassification (sketched below)
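
A compact sketch of how those transformers classes fit together is shown below; the toy texts, labels, and training settings are assumptions for illustration, not the course's exact configuration.

    # Sketch of fine-tuning BERT for sequence classification with the classes
    # named above; the tiny dataset and the training settings are assumptions.
    import torch
    from transformers import (BertForSequenceClassification, BertTokenizer,
                              Trainer, TrainingArguments)

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

    texts = ["markets rallied today", "the stock fell sharply"]  # toy examples
    labels = [1, 0]                                              # toy sentiment labels
    encodings = tokenizer(texts, truncation=True, padding=True, return_tensors="pt")

    class ToyDataset(torch.utils.data.Dataset):
        def __len__(self):
            return len(labels)
        def __getitem__(self, idx):
            item = {key: val[idx] for key, val in encodings.items()}
            item["labels"] = torch.tensor(labels[idx])
            return item

    args = TrainingArguments(output_dir="bert-out", num_train_epochs=1,
                             per_device_train_batch_size=2, logging_steps=1)
    Trainer(model=model, args=args, train_dataset=ToyDataset()).train()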

Intro to Gen AI in Finance

  1. Intro to BERT Models
  2. Hugging Face pre-trained models, online account, and local training
  3. Intro to fine-tuning models using the lightweight DistilBERT, with local training
  4. Using Hugging Face to map chat commands to backend actions, such as querying simulated results and re-running the simulation (see the sketch after this list)
  5. Connecting the logistic regression model to the chat front end
  6. Limitations of Hugging Face as of today (Dec 2024)
  7. Setting up the OpenAI API and Hugging Face
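
The sketch referenced in item 4 above could look roughly like this; the zero-shot model choice and the candidate action labels are assumptions for illustration, not the course's exact setup.

    # Sketch of using a Hugging Face pipeline to map a chat command to a backend
    # action; the model choice and the candidate action labels are assumptions.
    from transformers import pipeline

    classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

    actions = ["retrain the model", "change the features",
               "re-run the simulation", "query simulated results"]
    message = "run the simulation again with the latest data"

    result = classifier(message, candidate_labels=actions)
    print(result["labels"][0])  # highest-scoring action for this message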

Future Work:

  1. Feature selection and model retraining on real data
  2. Intro to applications of Gen AI in data analytics: data synthesis, anomaly detection, and predictive modeling
  3. Using OpenAI to cover the full spectrum of instructions for querying the model (we used 4-5 manually mapped cases)

For logistic regression, we look at everything we can do with the model: defining the input schema for retraining and defining the training arguments with optimizations. We use libraries like joblib or pickle to save and load the logistic regression model, and we query it with Postman, cURL, or any client capable of sending HTTP POST requests.
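
A sketch of that save/retrain flow is shown below; the endpoint path, schema fields, and file name are illustrative assumptions, not the course's exact code.

    # Sketch of a retrain endpoint: validate the input schema, fit the logistic
    # regression, and save it with joblib. Field names and paths are assumptions.
    import joblib
    from fastapi import FastAPI
    from pydantic import BaseModel
    from sklearn.linear_model import LogisticRegression

    app = FastAPI()

    class RetrainRequest(BaseModel):
        features: list[list[float]]  # rows of numeric features
        labels: list[int]            # one label per row

    @app.post("/retrain")
    def retrain(req: RetrainRequest):
        model = LogisticRegression(max_iter=1000)  # training arguments/optimizations
        model.fit(req.features, req.labels)
        joblib.dump(model, "logreg_model.joblib")  # save for later queries
        return {"status": "retrained", "n_samples": len(req.labels)}

It could then be exercised from cURL (or Postman) with an HTTP POST such as:

    curl -X POST http://127.0.0.1:8000/retrain \
         -H "Content-Type: application/json" \
         -d '{"features": [[0.1, 1.2], [0.4, 0.8]], "labels": [0, 1]}'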

Future work: other enhancements such as adding input validation, feature tracking, hyperparameter validation, and returning probabilities for predictions.
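
One of those enhancements, returning probabilities with basic input validation, could look roughly like this; the function name and the feature-count check are assumptions for illustration.

    # Sketch of returning class probabilities with simple input validation.
    # The function name and the expected-feature check are assumptions.
    import numpy as np

    def predict_with_probabilities(model, features, expected_n_features):
        x = np.asarray(features, dtype=float).reshape(1, -1)
        if x.shape[1] != expected_n_features:      # basic input validation
            raise ValueError(f"expected {expected_n_features} features, got {x.shape[1]}")
        probabilities = model.predict_proba(x)[0]  # one probability per class
        return {"prediction": int(probabilities.argmax()),
                "probabilities": probabilities.tolist()}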

