Web scraping lets you extract data from websites programmatically, which is useful for gathering information such as prices, inventory, and reviews.
OpenAI's language models enable an innovative approach to building robust web scrapers using natural language processing.
In this post, we will walk through a complete Scala code example that leverages OpenAI function calling to scrape product data from a sample ecommerce website.
Leveraging OpenAI Function Calling
OpenAI function calling provides a way to define schemas for the data you want extracted from a given input. When making an API request, you can specify a function name and parameters representing the expected output format.
OpenAI's natural language model will then analyze the provided input, extract relevant data from it, and return the extracted information structured according to the defined schema.
This pattern separates the raw data extraction capabilities of the AI model from your downstream data processing logic. Your code simply expects the data in a clean, structured format based on the function specification.
By leveraging OpenAI's natural language processing strengths for data extraction, you can create web scrapers that are resilient to changes in the underlying page structure and content. The business logic remains high-level and focused on data usage, while OpenAI handles the messy details of parsing and extracting information from complex HTML.
Why Use Function Calling
One key advantage of this web scraping technique is that the core scraper logic is immune to changes in the HTML structure of the target site. Since OpenAI is responsible for analyzing the raw HTML and extracting the desired data, the Scala code does not make any assumptions about HTML structure. The scraper will adapt as long as the sample HTML provided to OpenAI reflects the current page structure. This makes the scraper much more robust against site redesigns compared to scraping code that depends on specific HTML elements.
Overview
Here is an overview of the web scraping process we will implement:
- Send HTML representing the target page to OpenAI
- OpenAI analyzes the HTML and extracts the data we want
- OpenAI returns the extracted data structured as defined in our Scala function
- Process the extracted data in Scala as needed
This allows creating a scraper that adapts to changes in page layouts. The core logic stays high-level while OpenAI handles analyzing the raw HTML.
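To make the first step concrete, here is a minimal sketch of the JSON payload the OpenAI Chat Completions API expects when you register a function. The jsonEscape helper and the exact field layout are illustrative assumptions on my part; a client library would normally build this payload for you.

```scala
// Minimal JSON string escaping so raw HTML can be embedded in the payload.
// A sketch only: a real implementation would use a JSON library.
def jsonEscape(s: String): String =
  s.flatMap {
    case '"'  => "\\\""
    case '\\' => "\\\\"
    case '\n' => "\\n"
    case c    => c.toString
  }

// Build a Chat Completions request body that asks the model to call
// an `extractedData` function returning an array of product objects.
def buildRequestBody(model: String, html: String): String =
  s"""{
     |  "model": "$model",
     |  "messages": [{"role": "user", "content": "${jsonEscape(html)}"}],
     |  "functions": [{
     |    "name": "extractedData",
     |    "parameters": {
     |      "type": "object",
     |      "properties": {
     |        "products": {
     |          "type": "array",
     |          "items": {
     |            "type": "object",
     |            "properties": {
     |              "title": {"type": "string"},
     |              "description": {"type": "string"},
     |              "price": {"type": "string"}
     |            }
     |          }
     |        }
     |      }
     |    }
     |  }]
     |}""".stripMargin
```

The payload is then POSTed to the chat completions endpoint with your API key in the Authorization header.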
The Setup
To call the OpenAI API from Scala, you can use an OpenAI client library such as openai-scala (the coordinates and version below are illustrative; check the library's documentation for the current values):
libraryDependencies += "org.openai" %% "openai-scala" % "0.1.0"
This provides a Scala client for the OpenAI API.
Then in your code, you can create and initialize the OpenAI client:
import org.openai.Client
val client = Client("sk-...")
This will allow calling API methods such as createCompletion, which we use below.
So the key pieces needed for the Scala integration are the library dependency and an initialized client. With these configured, you can call the OpenAI API from Scala to implement the web scraping example using function calling.
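One refinement to the setup above: rather than hard-coding the key, resolve it from the environment. This is a sketch; OPENAI_API_KEY is a common convention, and resolveApiKey is an illustrative helper, not part of any library.

```scala
// Look up the API key in an environment map, failing with a clear
// message when it is absent. Pass sys.env in real code.
def resolveApiKey(env: Map[String, String]): Either[String, String] =
  env.get("OPENAI_API_KEY").toRight("OPENAI_API_KEY is not set")
```

In the example above, client creation would then become Client(resolveApiKey(sys.env).fold(sys.error, identity)).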
Sample HTML
First, we need some sample HTML representing the page content we want to scrape:
<div class="products">
<div class="product">
<h3>Blue T-Shirt</h3>
<p>A comfortable blue t-shirt made from 100% cotton.</p>
<p>Price: $14.99</p>
</div>
<div class="product">
<h3>Noise Cancelling Headphones</h3>
<p>These wireless over-ear headphones provide active noise cancellation.</p>
<p>Price: $199.99</p>
</div>
<div class="product">
<h3>Leather Laptop Bag</h3>
<p>Room enough for up to a 15" laptop. Made from genuine leather.</p>
<p>Price: $49.99</p>
</div>
</div>
This contains 3 product listings, each with a title, description and price.
Sending HTML to OpenAI
Next, we need to send this sample HTML to the OpenAI API. The HTML is passed in the message content:
val message = Message(
  Map("role" -> "user", "content" -> html)
)
This will allow OpenAI to analyze the HTML structure.
Defining Output Schema
We need to define the expected output schema so OpenAI knows what data to extract.
We'll define an extractedData function whose parameters describe the structure we want back:
val function = Function(
  "extractedData",
  Map(
    "type" -> "object",
    "properties" -> Map(
      "products" -> Map(
        "type" -> "array",
        "items" -> Map(
          "type" -> "object",
          "properties" -> Map(
            "title" -> Map("type" -> "string"),
            "description" -> Map("type" -> "string"),
            "price" -> Map("type" -> "string")
          )
        )
      )
    )
  )
)
This specifies that we want a products array of objects, each with a title, description, and price. Note that function-calling parameters must be a JSON Schema object at the top level, so the array is wrapped in a products property.
Calling OpenAI API
Now we can call the OpenAI API, passing the HTML and function:
val request = CompletionsRequest(
model = "gpt-3.5-turbo",
messages = List(message),
functions = List(function)
)
val response = client.createCompletion(request)
This will analyze the HTML and return extracted data matching the schema we defined.
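In a real response, the extracted data arrives as a JSON string in the function-call arguments. Here is a minimal, dependency-free sketch of decoding it; the flat-object regex assumes values contain no escaped quotes, and a production scraper would use a proper JSON library such as circe instead.

```scala
case class Product(title: String, description: String, price: String)

// Pull each flat JSON object out of the arguments payload with a regex.
// Assumption: objects are not nested and string values contain no
// escaped quotes -- fine for a sketch, not for production parsing.
def parseProducts(argumentsJson: String): List[Product] = {
  val flatObject = """\{[^{}]*\}""".r
  def field(obj: String, name: String): String =
    ("\"" + name + "\"\\s*:\\s*\"([^\"]*)\"").r
      .findFirstMatchIn(obj).map(_.group(1)).getOrElse("")
  flatObject.findAllIn(argumentsJson).toList.map { obj =>
    Product(field(obj, "title"), field(obj, "description"), field(obj, "price"))
  }
}
```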
Processing Extracted Data
Finally, we can process the extracted data in our Scala function:
case class Product(
title: String,
description: String,
price: String
)
def extractedData(products: List[Product]): Unit = {
println("Extracted Product Data:")
products.foreach { product =>
println(product.title)
println(product.description)
println(product.price)
}
}
This simply prints each product's details; we could instead save the data to a database or pass it on to downstream systems.
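As one hypothetical example of downstream processing (parsePrice and totalValue are illustrative helpers, not part of any library), the extracted price strings can be converted to numbers and aggregated:

```scala
// Strip currency symbols and convert a string like "$14.99" to a BigDecimal.
def parsePrice(price: String): BigDecimal =
  BigDecimal(price.replaceAll("[^0-9.]", ""))

// Total catalogue value of all scraped products.
def totalValue(prices: List[String]): BigDecimal =
  prices.map(parsePrice).sum
```

For the three sample products above, totalValue would sum 14.99, 199.99, and 49.99.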
So by leveraging OpenAI's function calling from Scala, we can build a robust web scraper that is resilient against changes in HTML structure. The core logic focuses on using the extracted data while OpenAI handles parsing the raw HTML.
Full Code Example
Here is the complete Scala code to scrape product data using OpenAI function calling:
import org.openai._
case class Product(
title: String,
description: String,
price: String
)
def extractedData(products: List[Product]): Unit = {
println("Extracted Product Data:")
products.foreach { product =>
println(product.title)
println(product.description)
println(product.price)
}
}
object Main extends App {
val client = Client("sk-...")
val html =
"""<div class="products">...</div>"""
val message = Message(
  Map("role" -> "user", "content" -> html)
)
val function = Function(
"extractedData",
Map(
  "type" -> "object",
  "properties" -> Map(
    // ... products array schema ...
  )
)
)
val request = CompletionsRequest(
model = "gpt-3.5-turbo",
messages = List(message),
functions = List(function)
)
val response = client.createCompletion(request)
val products = List.empty[Product] // TODO: parse the function-call arguments from `response`
extractedData(products)
}
Conclusion
Using OpenAI opens up an exciting new way to approach web scraping that wasn't possible before.
However, this approach also has limitations: every scrape costs an API call, each call adds latency, and the HTML must fit within the model's context window.
A more robust solution for large-scale scraping is a dedicated web scraping API like Proxies API.
With features like automatic IP rotation, user-agent rotation, and CAPTCHA solving, Proxies API makes robust web scraping easy via a simple API:
curl "https://api.proxiesapi.com/?key=API_KEY&url=targetsite.com"
Get started now with 1000 free API calls to supercharge your web scraping!