Latest

Jul 30
Any Python Programmer Can Use State-of-the-art AI Models… Including You!

You don't need a PhD to implement and train the best natural language processing models
6 min read
Jul 29
Gramformer: Correct Grammar With a Transformer Model

Learn how to correct grammar using a state-of-the-art NLP Transformer model called T5
3 min read
Jul 28
How to Label Text Classification Training Data -- With AI

Label your training data with a zero-shot Transformer model. Then, use the labelled data to fine-tune a small supervised model
7 min read
Jul 26
Predict Future Events With Transformer Models (NLP Research Idea)

If you can predict the ending of Harry Potter, why can't GPT-3?
3 min read
Jun 25
How to Summarize Text With Transformer Models

Learn how to implement advanced text summarization Transformer models with just a few lines of Python code
4 min read
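The post covers Transformer summarizers, which need a pretrained model. The task itself can be illustrated with a classical frequency-based extractive summarizer in plain Python; this is a stand-in sketch with made-up example text, not the Transformer method the article teaches:

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=1):
    """Score sentences by word frequency and keep the top n
    (a classical stand-in for abstractive Transformer summarization)."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    # A sentence's score is the summed corpus frequency of its words.
    def score(s):
        return sum(freq[w] for w in re.findall(r"[a-z']+", s.lower()))

    top = sorted(sentences, key=score, reverse=True)[:n_sentences]
    # Emit the chosen sentences in their original order.
    return " ".join(s for s in sentences if s in top)

text = ("Transformers changed NLP. Transformers power modern summarizers. "
        "I ate eggs.")
print(extractive_summary(text, n_sentences=1))
```

A Transformer model instead generates new sentences (abstractive summarization), but the input/output shape of the task is the same: long text in, short text out.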
Jun 23
Generating Text Classification Training Data With Zero-Shot Transformer Models

So you don't have any labelled data for your text classification NLP project? Just use a zero-shot Transformer model to generate data!
5 min read
Jun 14
Implement and Train Text Classification Transformer Models

Learn how to implement and train text classification Transformer models like BERT, DistilBERT and more with only a few lines of code
4 min read
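For contrast with fine-tuning BERT or DistilBERT, here is what a text classifier looks like without a Transformer at all: a tiny multinomial Naive Bayes built from scratch on a made-up toy dataset. This is a classical baseline sketch, not the method the article covers:

```python
import math
from collections import Counter, defaultdict

class NaiveBayes:
    """Tiny multinomial Naive Bayes text classifier
    (a classical baseline, not BERT fine-tuning)."""

    def fit(self, texts, labels):
        self.word_counts = defaultdict(Counter)   # per-class word counts
        self.class_counts = Counter(labels)       # class priors
        for text, label in zip(texts, labels):
            self.word_counts[label].update(text.lower().split())
        self.vocab = {w for c in self.word_counts.values() for w in c}
        return self

    def predict(self, text):
        def log_prob(label):
            counts = self.word_counts[label]
            total = sum(counts.values())
            lp = math.log(self.class_counts[label] / sum(self.class_counts.values()))
            for w in text.lower().split():
                # Laplace smoothing so unseen words don't zero out a class.
                lp += math.log((counts[w] + 1) / (total + len(self.vocab)))
            return lp
        return max(self.class_counts, key=log_prob)

clf = NaiveBayes().fit(
    ["great movie loved it", "terrible film hated it",
     "loved the acting", "hated the plot"],
    ["pos", "neg", "pos", "neg"],
)
print(clf.predict("loved this movie"))  # → pos
```

A fine-tuned Transformer replaces the hand-built probability model with learned contextual representations, which is where the accuracy gains come from.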
Jun 11
How to Perform Sentiment Analysis With TextBlob

Learn how to perform sentiment analysis with a Python package called TextBlob in only a few lines of code.
2 min read
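TextBlob's default sentiment analyzer is lexicon-based: it averages per-word polarity scores from a built-in lexicon. The idea can be illustrated in a few lines of plain Python; the tiny lexicon below is made up for the sketch, not TextBlob's actual one:

```python
# Toy polarity lexicon: word -> score in [-1, 1] (made up for illustration).
LEXICON = {"great": 0.8, "good": 0.7, "love": 0.5,
           "bad": -0.7, "awful": -1.0, "hate": -0.8}

def polarity(text):
    """Average the polarity of the known words in the text."""
    scores = [LEXICON[w] for w in text.lower().split() if w in LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

print(polarity("great food but awful service"))  # (0.8 - 1.0) / 2 ≈ -0.1
```

TextBlob adds handling for negation, intensifiers, and subjectivity on top of this basic averaging scheme.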
Jun 11
GPT-J: a 6 Billion Parameter Open-Source GPT-3 Model

EleutherAI just published GPT-J-6B, a 6-billion-parameter open-source model in the style of GPT-3. Try it out now!
2 min read
Jun 08

How to Generate Harry Potter Stories With GPT-2

Learn how to generate Harry Potter fan fiction stories with GPT-2 – a text generation Transformer model
3 min read
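GPT-2 generates stories by repeatedly sampling the next word given the words so far. The same loop can be shown with the simplest possible "language model", an order-1 Markov chain over a made-up mini-corpus; this is a classical stand-in sketch, not GPT-2 itself:

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it
    (order-1 Markov chain, a toy stand-in for a neural language model)."""
    words = text.split()
    chain = defaultdict(list)
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def generate(chain, start, length=8, seed=0):
    """Sample one word at a time from the chain, like GPT-2's decoding loop."""
    rng = random.Random(seed)  # seeded for reproducibility
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:
            break  # dead end: no observed continuation
        out.append(rng.choice(followers))
    return " ".join(out)

corpus = ("harry waved his wand and the door opened and harry "
          "waved his broom and the snitch flew")
chain = build_chain(corpus)
print(generate(chain, "harry", length=6, seed=0))
```

GPT-2 replaces the lookup table with a Transformer that conditions on the entire preceding context, which is what lets it produce coherent multi-paragraph fan fiction instead of locally plausible word salad.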