Web scraping lets you extract data from websites programmatically, which is useful for gathering information such as prices, inventory, and reviews.
OpenAI offers an innovative approach to building robust web scrapers using natural language processing.
In this post, we will walk through a complete Visual Basic code example that leverages OpenAI function calling to scrape product data from a sample ecommerce website.
Leveraging OpenAI Function Calling
OpenAI function calling provides a way to define schemas for the data you want extracted from a given input. When making an API request, you can specify a function name and parameters representing the expected output format.
OpenAI's natural language model will then analyze the provided input, extract relevant data from it, and return the extracted information structured according to the defined schema.
This pattern separates the raw data extraction capabilities of the AI model from your downstream data processing logic. Your code simply expects the data in a clean, structured format based on the function specification.
By leveraging OpenAI's natural language processing strengths for data extraction, you can create web scrapers that are resilient to changes in the underlying page structure and content. The business logic remains high-level and focused on data usage, while OpenAI handles the messy details of parsing and extracting information from complex HTML.
Why Use Function Calling
One key advantage of this web scraping technique is that the core scraper logic is largely insulated from changes in the HTML structure of the target site. Since OpenAI is responsible for analyzing the raw HTML and extracting the desired data, the Visual Basic code does not make any assumptions about HTML structure. The scraper will adapt as long as the sample HTML provided to OpenAI reflects the current page structure. This makes the scraper much more robust against site redesigns compared to scraping code that depends on specific HTML elements.
Overview
Here is an overview of the web scraping process we will implement:
- Send HTML representing the target page to OpenAI
- OpenAI analyzes the HTML and extracts the data we want
- OpenAI returns the extracted data structured as defined in our VB function
- Process the extracted data in VB as needed
This allows creating a scraper that adapts to changes in page layouts. The core logic stays high-level while OpenAI handles analyzing the raw HTML.
The Setup
To call the OpenAI API from Visual Basic, you can use the OpenAI.VBNet library:
Imports OpenAI
This provides a VB.NET client for the OpenAI API.
Then in your code, you can create and initialize the OpenAI client:
Dim openai As New OpenAI("sk-...")
This will allow calling API methods like CreateCompletion, which we use below to submit the HTML and function definition.
So the key pieces needed for the VB.NET integration are:
- A reference to the OpenAI client library (Imports OpenAI)
- An OpenAI API key
- An initialized client instance to make API calls with
With these dependencies configured, you can call the OpenAI API from VB.NET to implement the web scraping example using function calling.
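As a side note, it is safer to read the key from an environment variable than to hardcode it. A minimal sketch, assuming the same OpenAI client class shown above:
' Read the API key from an environment variable rather than hardcoding it
Dim apiKey As String = Environment.GetEnvironmentVariable("OPENAI_API_KEY")
If String.IsNullOrEmpty(apiKey) Then
    Throw New InvalidOperationException("OPENAI_API_KEY is not set")
End If
Dim openai As New OpenAI(apiKey)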
Sample HTML
First, we need some sample HTML representing the page content we want to scrape:
<div class="products">
<div class="product">
<h3>Blue T-Shirt</h3>
<p>A comfortable blue t-shirt made from 100% cotton.</p>
<p>Price: $14.99</p>
</div>
<div class="product">
<h3>Noise Cancelling Headphones</h3>
<p>These wireless over-ear headphones provide active noise cancellation.</p>
<p>Price: $199.99</p>
</div>
<div class="product">
<h3>Leather Laptop Bag</h3>
<p>Room enough for up to a 15" laptop. Made from genuine leather.</p>
<p>Price: $49.99</p>
</div>
</div>
This contains 3 product listings, each with a title, description and price.
Sending HTML to OpenAI
Next, we need to send this sample HTML to the OpenAI API. The HTML is passed in the message content:
Dim message As New Message With {
    .Role = "user",
    .Content = html
}
This will allow OpenAI to analyze the HTML structure.
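If the model needs more guidance, you can prepend a short instruction to the HTML before sending it. This is optional, and the wording below is only illustrative:
' Optionally prepend a brief instruction so the model knows what to extract
Dim prompt As String = "Extract every product listed in the following HTML:" & vbCrLf & html
Dim message As New Message With {
    .Role = "user",
    .Content = prompt
}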
Defining Output Schema
We need to define the expected output schema so OpenAI knows what data to extract.
We'll define an ExtractedData function whose parameters schema describes the product fields we want:
' "Function" is a reserved word in VB.NET, so the variable gets a different name and the type name is bracket-escaped
Dim extractFunction As New [Function] With {
    .Name = "ExtractedData",
    .Parameters = "{""type"": ""object"", ""properties"": {""products"": {""type"": ""array"", ""items"": {""type"": ""object"", ""properties"": {""title"": {""type"": ""string""}, ""description"": {""type"": ""string""}, ""price"": {""type"": ""string""}}}}}, ""required"": [""products""]}"
}
This specifies that we want a products array, where each item is an object with a title, description and price. (The OpenAI API requires the top-level parameters schema to be a JSON object, which is why the array is wrapped in a products property.)
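For the sample HTML above, the arguments the model returns for this function should look roughly like this (exact formatting may vary):
{
  "products": [
    {"title": "Blue T-Shirt", "description": "A comfortable blue t-shirt made from 100% cotton.", "price": "$14.99"},
    {"title": "Noise Cancelling Headphones", "description": "These wireless over-ear headphones provide active noise cancellation.", "price": "$199.99"},
    {"title": "Leather Laptop Bag", "description": "Room enough for up to a 15\" laptop. Made from genuine leather.", "price": "$49.99"}
  ]
}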
Calling OpenAI API
Now we can call the OpenAI API, passing the HTML and function:
' Function calling requires a chat model such as gpt-3.5-turbo or gpt-4
Dim request As New CompletionRequest With {
    .Model = "gpt-3.5-turbo",
    .Messages = {message},
    .Functions = {extractFunction}
}
Dim response = openai.CreateCompletion(request)
This will analyze the HTML and return extracted data matching the schema we defined.
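The extracted data arrives as a JSON string in the function call's arguments. Here is a minimal parsing sketch using System.Text.Json, assuming the response object exposes the usual Choices/Message/FunctionCall path; adjust the property names to whatever your client library actually returns:
Imports System.Text.Json

' The model returns the extracted data as a JSON string in the function call arguments.
' The property path below mirrors the OpenAI chat response shape and may differ per client library.
Dim argumentsJson As String = response.Choices(0).Message.FunctionCall.Arguments

' Pull out the "products" array defined in the schema and map it onto our Product class
Dim options As New JsonSerializerOptions With {.PropertyNameCaseInsensitive = True}
Dim root As JsonElement = JsonDocument.Parse(argumentsJson).RootElement
Dim products As List(Of Product) =
    JsonSerializer.Deserialize(Of List(Of Product))(root.GetProperty("products").GetRawText(), options)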
Processing Extracted Data
Finally, we can process the extracted data in our VB function:
Private Sub ExtractedData(products As List(Of Product))
    Console.WriteLine("Extracted Product Data:")
    For Each product In products
        Console.WriteLine(product.Title)
        Console.WriteLine(product.Description)
        Console.WriteLine(product.Price)
    Next
End Sub
This simply writes out each product's details. We could also save the data to a database etc.
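For example, persisting the results to a CSV file only takes a few lines. A minimal sketch, with a hypothetical products.csv path:
Imports System.IO

' Write the extracted products to a simple CSV file
Private Sub SaveToCsv(products As List(Of Product), path As String)
    Dim lines As New List(Of String) From {"Title,Description,Price"}
    For Each product In products
        ' Quote fields so commas in descriptions don't break the CSV
        lines.Add(String.Format("""{0}"",""{1}"",""{2}""", product.Title, product.Description, product.Price))
    Next
    File.WriteAllLines(path, lines)
End Sub

' Usage: SaveToCsv(products, "products.csv")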
So by leveraging OpenAI's function calling from VB.NET, we can build a robust web scraper that is resilient against changes in HTML structure. The core logic focuses on using the extracted data while OpenAI handles parsing the raw HTML.
Full Code Example
Here is the complete Visual Basic code to scrape product data using OpenAI function calling:
Imports System.Collections.Generic
Imports System.Text.Json
Imports OpenAI

Module Program

    Public Class Product
        Public Property Title As String
        Public Property Description As String
        Public Property Price As String
    End Class

    Private Sub ExtractedData(products As List(Of Product))
        Console.WriteLine("Extracted Product Data:")
        For Each product In products
            Console.WriteLine(product.Title)
            Console.WriteLine(product.Description)
            Console.WriteLine(product.Price)
        Next
    End Sub

    Public Sub Main()
        Dim html As String = "<div>...</div>"

        Dim openai As New OpenAI("sk-...")

        Dim message As New Message With {
            .Role = "user",
            .Content = html
        }

        ' "Function" is a reserved word in VB.NET, hence the bracket-escaped type name.
        ' The Parameters string is the same products schema shown earlier (elided here).
        Dim extractFunction As New [Function] With {
            .Name = "ExtractedData",
            .Parameters = "{...}"
        }

        Dim request As New CompletionRequest With {
            .Model = "gpt-3.5-turbo",
            .Messages = {message},
            .Functions = {extractFunction}
        }

        Dim response = openai.CreateCompletion(request)

        ' Parse the JSON arguments returned in the function call.
        ' The Choices/Message/FunctionCall path mirrors the OpenAI chat response and may differ per client library.
        Dim argumentsJson As String = response.Choices(0).Message.FunctionCall.Arguments
        Dim options As New JsonSerializerOptions With {.PropertyNameCaseInsensitive = True}
        Dim root As JsonElement = JsonDocument.Parse(argumentsJson).RootElement
        Dim products As List(Of Product) =
            JsonSerializer.Deserialize(Of List(Of Product))(root.GetProperty("products").GetRawText(), options)

        ExtractedData(products)
    End Sub

End Module
Conclusion
Using OpenAI opens up an exciting new way to approach web scraping that wasn't possible before.
However, this approach also has some limitations. A more robust solution is to use a dedicated web scraping API like Proxies API.
With Proxies API, you get features like automatic IP rotation, user-agent rotation and CAPTCHA solving, which make robust web scraping easy via a simple API:
curl "https://api.proxiesapi.com/?key=API_KEY&url=targetsite.com"
Get started now with 1000 free API calls to supercharge your web scraping!