Harnessing the Anthropic API with F#

Read Time: 19 minutes

Since LLMs are currently all the rage, I wanted to spend some time today digging into how to leverage the Anthropic API using F#. Anthropic provides language models that can be used for various natural language processing tasks, and they can be a handy tool to add to the application toolkit.

Although Anthropic provides some language SDKs, they do not have anything for .NET. Thankfully they provide a REST endpoint, so that’s the route I’ll take. One consideration is that the API isn’t free, but API calls are pretty cheap. The target is their Messages API and their new v3 models: Opus, Sonnet, and Haiku.

As always, there are a couple of things to set up first: some record types for interacting with the API, plus a few settings to configure. Changing these alters the responses. Regarding the JSON encoding for the REST calls, the API expects lower snake_case field names; the camelCase serializer policy handles the single-word names, and JsonPropertyName attributes cover the rest.

Most of this is pretty basic, and it’s best to look at their documentation for more details. I’ll call out that temperature is set to 1.0, which is more “creative”. That’s exactly what I want in a programming assistant, so I’ll go with that. The system prompt provides some direction on how I want answers to come back. Prompt engineering is a whole thing, so I won’t go into details. But as an example, with no prompt, “name some German cities” gives me a list of major cities as well as details about them. With the prompt below I’m more likely to get a plain comma-delimited list, or sometimes even a Python array. The key here is that you can hone the shape of the answers.

```fsharp
open System
open System.Net.Http
open System.Text
open System.Text.Json
open System.Text.Json.Serialization

[<CLIMutable>]
type Message = {
    Role: string
    Content: string
}

[<CLIMutable>]
type Request = {
    Model: string
    Messages: Message[]
    [<JsonPropertyName("max_tokens")>]
    MaxTokens: int
    Temperature: float
    System: string
}

[<CLIMutable>]
type Usage = {
    [<JsonPropertyName("input_tokens")>]
    InputTokens: int
    [<JsonPropertyName("output_tokens")>]
    OutputTokens: int
}

[<CLIMutable>]
type Content = {
    Type: string
    Text: string
}

[<CLIMutable>]
type Response = {
    Id: string
    Type: string
    Role: string
    Model: string
    [<JsonPropertyName("stop_sequence")>]
    StopSequence: string option
    Usage: Usage
    Content: Content[]
    [<JsonPropertyName("stop_reason")>]
    StopReason: string
}

let anthropicUrl = "https://api.anthropic.com/v1/messages"
let anthropicVersion = "2023-06-01"
let apiKey = Environment.GetEnvironmentVariable("ANTHROPIC_API_KEY")
let mutable model =
    "claude-3-opus-20240229" // opus
    //"claude-3-sonnet-20240229" // sonnet
    //"claude-3-haiku-20240307" // haiku
let mutable temperature = 1.0 // 0.0 - 1.0
let mutable max_tokens = 1000
let mutable systemMessage = "You are a programming assistant"

let jsonSerializerOptions = JsonSerializerOptions(
    PropertyNamingPolicy = JsonNamingPolicy.CamelCase)
```
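
As an aside on honing the answer shape, the system prompt and temperature are just mutable settings, so they can be swapped out at any point. Here is a sketch of a more directive prompt; the wording is a hypothetical example of mine, not the prompt used in the runs below:

```fsharp
// Hypothetical, more directive system prompt (example wording, not the original).
// Pushes list-style answers toward bare, comma-delimited lists with no commentary.
systemMessage <- "You are a programming assistant. When asked for lists, reply with a plain comma-delimited list and no extra commentary."

// Lower temperature if more deterministic answers are preferred.
temperature <- 0.2
```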

At this point, the function to interface with the API can be written. There isn’t anything crazy in it; it’s a basic REST call, just formatted as the Anthropic API requires. Once the function is together, it can be called with a simple message. The response JSON is different depending on whether the call succeeded or failed, so it is helpful to accommodate both situations.
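
One thing the listings below reference but never define is the ErrorResponse type used for the failure case. A minimal sketch, assuming the error shape Anthropic documents (a top-level type plus a nested error object with a type and a message), might look like this:

```fsharp
// Minimal sketch of the error payload; field names assumed from Anthropic's
// documented error shape: { "type": "error", "error": { "type": ..., "message": ... } }.
// The camelCase naming policy maps these to lowercase when deserializing.
[<CLIMutable>]
type ErrorDetail = {
    Type: string
    Message: string
}

[<CLIMutable>]
type ErrorResponse = {
    Type: string
    Error: ErrorDetail
}
```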

```fsharp
let sendMessage (message: string) = task {
    use httpClient = new HttpClient()
    httpClient.DefaultRequestHeaders.Add("x-api-key", apiKey)
    httpClient.DefaultRequestHeaders.Add("anthropic-version", anthropicVersion)

    let userMessage =
        { Role = "user"
          Content = message }

    let requestData =
        { Model = model
          Messages = [| userMessage |]
          MaxTokens = max_tokens
          Temperature = temperature
          System = systemMessage }

    let requestJson = JsonSerializer.Serialize(requestData, jsonSerializerOptions)
    let requestContent = new StringContent(requestJson, Encoding.UTF8, "application/json")

    let! response = httpClient.PostAsync(anthropicUrl, requestContent)
    let! responseJson = response.Content.ReadAsStringAsync()
    let responseData = JsonSerializer.Deserialize<Response>(responseJson, jsonSerializerOptions)

    if not (responseData.Content = null) then
        // Success
        if not (Array.isEmpty responseData.Content) then
            let responseText =
                responseData.Content
                |> Array.filter (fun x -> x.Type = "text")
                |> Array.map (fun x -> x.Text)
                |> String.concat "\n"

            return Some responseText
        else
            return None
    else
        // Error
        let errorResponseData = JsonSerializer.Deserialize<ErrorResponse>(responseJson, jsonSerializerOptions)
        printfn "Error processing request: %A" errorResponseData
        return None
}

let message = "Write an F# function to factor a number"
let response =
    sendMessage message
    |> Async.AwaitTask
    |> Async.RunSynchronously
printfn $"response: {response}"
```

So, what does an F# function that factors a number look like? The output below shows what Anthropic’s Opus thinks, at least. In addition to the textual result shown, the API response structure includes other details, like input tokens (21) and output tokens (695). Depending on the application, these extras can be useful.

response: Here's an F# function to factor a number:

```fsharp
let rec factorize n =
    let rec findFactor i =
        if i * i > n then
            [n]
        elif n % i = 0 then
            i :: factorize (n / i)
        else
            findFactor (i + 1)

    if n <= 1 then
        []
    else
        findFactor 2
```

This function uses a recursive approach to find the factors of a given number `n`. Here's how it works:

1. The main function `factorize` takes an integer `n` as input.
2. It first checks if `n` is less than or equal to 1. If so, it returns an empty list since numbers less than or equal to 1 have no factors.
3. If `n` is greater than 1, it calls the inner recursive function `findFactor` with an initial value of 2.
4. The `findFactor` function takes an integer `i` as input and does the following:
- If `i * i` is greater than `n`, it means that `n` is a prime number and cannot be factored further. In this case, it returns a list containing only `n`.
- If `n` is divisible by `i` (i.e., `n % i = 0`), it means that `i` is a factor of `n`. It adds `i` to the list of factors and recursively calls `factorize` with `n / i` to find the factors of the remaining part.
- If `n` is not divisible by `i`, it calls `findFactor` with `i + 1` to check the next potential factor.
5. The function continues recursively until all factors are found, and it returns the list of factors.

Here are a few examples of using the `factorize` function:

```fsharp
printfn "%A" (factorize 12) // Output: [2; 2; 3]
printfn "%A" (factorize 18) // Output: [2; 3; 3]
printfn "%A" (factorize 17) // Output: [17]
printfn "%A" (factorize 1)  // Output: []
```

In the above examples:
- `factorize 12` returns `[2; 2; 3]` because 12 = 2 * 2 * 3.
- `factorize 18` returns `[2; 3; 3]` because 18 = 2 * 3 * 3.
- `factorize 17` returns `[17]` because 17 is a prime number.
- `factorize 1` returns an empty list because 1 has no factors.

Note that this function returns the factors in the order they are found, which may not necessarily be in ascending order.
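
Since the Response record carries the Usage counts mentioned above, surfacing them takes only a line or two. A small sketch (the logUsage helper is mine, not from the original listing) that could be called on responseData inside sendMessage right after deserialization:

```fsharp
// Hypothetical helper: print token usage from a deserialized Response.
// Handy for keeping an eye on per-call cost.
let logUsage (resp: Response) =
    printfn $"input tokens: {resp.Usage.InputTokens}, output tokens: {resp.Usage.OutputTokens}"
```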

One aspect of how LLMs work is that they can leverage not just a single message but an entire conversation history. This is where context comes from when you need to tweak something. Doing this is pretty easy. Here I make a modification to store user messages and responses, then feed this into the API. In a bit more detail, the main changes are: I append user messages and LLM responses to a conversation history, I send the whole history to the API, and I add a REPL so the user can send a message, see the result, and then provide follow-up information.

```fsharp
/// Conversation history
let mutable history = []

let sendMessageWithHistory (message: string) = task {
    use httpClient = new HttpClient()
    httpClient.DefaultRequestHeaders.Add("x-api-key", apiKey)
    httpClient.DefaultRequestHeaders.Add("anthropic-version", anthropicVersion)

    let systemMessage = systemMessage
    let userMessage =
        { Role = "user"
          Content = message }

    // Add user message to conversation
    history <- List.append history [ userMessage ]

    let requestData =
        { Model = model
          Messages = List.toArray history
          MaxTokens = max_tokens
          Temperature = temperature
          System = systemMessage }

    let requestJson = JsonSerializer.Serialize(requestData, jsonSerializerOptions)
    let requestContent = new StringContent(requestJson, Encoding.UTF8, "application/json")

    let! response = httpClient.PostAsync(anthropicUrl, requestContent)
    let! responseJson = response.Content.ReadAsStringAsync()
    let responseData = JsonSerializer.Deserialize<Response>(responseJson, jsonSerializerOptions)

    if not (responseData.Content = null) then
        // Success
        if not (Array.isEmpty responseData.Content) then
            let responseText =
                responseData.Content
                |> Array.filter (fun x -> x.Type = "text")
                |> Array.map (fun x -> x.Text)
                |> String.concat "\n"

            let assistantMessage =
                { Role = "assistant"
                  Content = responseText }

            // Add response to conversation
            history <- List.append history [ assistantMessage ]

            return Some responseText
        else
            return None
    else
        // Error
        let errorResponseData = JsonSerializer.Deserialize<ErrorResponse>(responseJson, jsonSerializerOptions)
        printfn "Error processing request: %A" errorResponseData
        return None
}

let rec loop() = task {
    printf "> "
    let input = Console.ReadLine()
    match input with
    | "quit" -> ()
    | message ->
        let! response = sendMessageWithHistory message
        match response with
        | Some r -> printfn $"{r}"
        | None -> printfn "No content from api call"
        return! loop()
}

loop()
|> Async.AwaitTask
|> Async.RunSynchronously
```

Here is what the output from this looks like. You can see that the ability to include conversation history provides a nice experience. If you’ve used any of the web interfaces for these LLMs, this is the experience you are probably used to.

> write and F# function to split a paragraph into sentences
Here's an F# function that splits a paragraph into sentences:

```fsharp
open System
open System.Text.RegularExpressions

let splitIntoSentences (paragraph: string) =
    let sentenceEnders = [| '.'; '!'; '?' |]
    let mutable sentences = []
    let mutable currentSentence = ""

    for c in paragraph do
        currentSentence <- currentSentence + string c
        if Array.contains c sentenceEnders then
            sentences <- sentences @ [currentSentence.Trim()]
            currentSentence <- ""

    if not (String.IsNullOrWhiteSpace(currentSentence)) then
        sentences <- sentences @ [currentSentence.Trim()]

    sentences

// Example usage
let paragraph = "This is the first sentence. This is the second sentence! And this is the third sentence? This is the fourth and final sentence."
let sentences = splitIntoSentences paragraph
printfn "%A" sentences
```

In this function:

1. We define an array `sentenceEnders` that contains the characters that typically end a sentence: period (`.`), exclamation mark (`!`), and question mark (`?`).

2. We initialize a mutable list `sentences` to store the individual sentences and a mutable string `currentSentence` to build each sentence.

3. We iterate over each character `c` in the input `paragraph`:
- We append the character to `currentSentence`.
- If the character is found in `sentenceEnders`, we consider it the end of a sentence:
- We add the `currentSentence` (after trimming any leading/trailing whitespace) to the `sentences` list.
- We reset `currentSentence` to an empty string to start building the next sentence.

4. After the loop, if `currentSentence` is not empty or whitespace, we add it to the `sentences` list as the last sentence.

5. Finally, we return the `sentences` list containing the individual sentences.

In the example usage, we have a paragraph string that consists of multiple sentences. We pass this paragraph to the `splitIntoSentences` function, which returns a list of individual sentences. We then print the list of sentences using `printfn "%A"`.

Output:
```
["This is the first sentence."; "This is the second sentence!"; "And this is the third sentence?"; "This is the fourth and final sentence."]
```

The function splits the paragraph into individual sentences based on the presence of sentence-ending punctuation marks (`.`, `!`, `?`). Note that this is a basic implementation and may not handle all edge cases or complex sentence structures perfectly.
> modify the function to return word count with each sentence
Sure! Here's the modified F# function that returns a tuple containing each sentence along with its word count:

```fsharp
open System
open System.Text.RegularExpressions

let splitIntoSentencesWithWordCount (paragraph: string) =
    let sentenceEnders = [| '.'; '!'; '?' |]
    let mutable sentences = []
    let mutable currentSentence = ""

    for c in paragraph do
        currentSentence <- currentSentence + string c
        if Array.contains c sentenceEnders then
            let sentence = currentSentence.Trim()
            let wordCount = sentence.Split([| ' '; '\t'; '\r'; '\n' |], StringSplitOptions.RemoveEmptyEntries).Length
            sentences <- sentences @ [(sentence, wordCount)]
            currentSentence <- ""

    if not (String.IsNullOrWhiteSpace(currentSentence)) then
        let sentence = currentSentence.Trim()
        let wordCount = sentence.Split([| ' '; '\t'; '\r'; '\n' |], StringSplitOptions.RemoveEmptyEntries).Length
        sentences <- sentences @ [(sentence, wordCount)]

    sentences

// Example usage
let paragraph = "This is the first sentence. This is the second sentence! And this is the third sentence? This is the fourth and final sentence."
let sentencesWithWordCount = splitIntoSentencesWithWordCount paragraph
printfn "%A" sentencesWithWordCount
```

The modifications made to the function are as follows:

1. Instead of just storing the sentence in the `sentences` list, we now store a tuple `(sentence, wordCount)` for each sentence.

2. After extracting a sentence (trimming leading/trailing whitespace), we calculate its word count using the following steps:
- We split the sentence using the `Split` method with an array of delimiters including space, tab, carriage return, and newline characters.
- We use `StringSplitOptions.RemoveEmptyEntries` to remove any empty substrings resulting from consecutive delimiters.
- We count the length of the resulting array, which gives us the word count for the sentence.

3. We add the tuple `(sentence, wordCount)` to the `sentences` list.

4. After the loop, if `currentSentence` is not empty or whitespace, we process it as the last sentence, calculate its word count, and add it to the `sentences` list as a tuple.

5. Finally, we return the `sentences` list containing tuples of individual sentences and their corresponding word counts.

In the example usage, we have the same paragraph string as before. We pass this paragraph to the `splitIntoSentencesWithWordCount` function, which returns a list of tuples, where each tuple contains a sentence and its word count. We then print the list of sentence-word count tuples using `printfn "%A"`.

Output:
```
[("This is the first sentence.", 5); ("This is the second sentence!", 5); ("And this is the third sentence?", 6); ("This is the fourth and final sentence.", 7)]
```

The function splits the paragraph into individual sentences and returns each sentence along with its word count. Note that this implementation considers words as substrings separated by spaces, tabs, carriage returns, or newlines. It may not handle all edge cases or complex word structures perfectly.
> quit
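
One consequence of resending the full history on every call is that input token usage grows with each turn. If that becomes a concern, the history can be capped before building the request. Here is a hedged sketch; the trimHistory helper and the keep count are mine, not part of the original code:

```fsharp
// Hypothetical helper: keep only the most recent messages to bound input tokens.
// Dropping older turns loses context, so pick the keep count to suit the use case.
let trimHistory (keep: int) (messages: Message list) =
    if List.length messages <= keep then messages
    else messages |> List.skip (List.length messages - keep)

// e.g. before building requestData: let trimmed = trimHistory 20 history
```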

In addition to sending text-based messages, the Anthropic API also supports sending image data as part of the conversation. Let’s explore how to handle image data using F#. For this I’ll need to modify some of the definitions to support image-related fields. I also modify the JSON serializer to exclude empty (None) values when serializing. The API fails when extra fields are sent, and this is a good way to handle that.

```fsharp
[<CLIMutable>]
[<Struct>]
type ImageSource =
    { Type: string
      [<JsonPropertyName("media_type")>]
      MediaType: string
      Data: string }

[<CLIMutable>]
type MessageContent =
    { Type: string
      Source: ImageSource option
      Text: string option }

[<CLIMutable>]
type Message = {
    Role: string
    Content: MessageContent[]
}

[<CLIMutable>]
type Request = {
    Model: string
    Messages: Message[]
    [<JsonPropertyName("max_tokens")>]
    MaxTokens: int
    Temperature: float
    System: string option
}

jsonSerializerOptions.DefaultIgnoreCondition <- JsonIgnoreCondition.WhenWritingNull
```

Sending an image is similar to sending text; the primary difference is a slightly different message structure. The image itself is sent as a Base64-encoded string. An additional note with images: be careful of the cost when sending them. There is a calculation from image size to tokens; for this particular example, I use a 200x151 image to keep the token count down (the image here is displayed bigger, just so it’s easier to see).
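
To get a rough sense of that cost up front, Anthropic’s vision docs give an approximation of about (width × height) / 750 tokens per image. A small sketch based on that rule of thumb; the helper name and rounding are my own:

```fsharp
// Rough image token estimate, based on the (width * height) / 750 approximation
// from Anthropic's vision docs; helper name and integer rounding are assumptions.
let estimateImageTokens (widthPx: int) (heightPx: int) =
    (widthPx * heightPx) / 750

// e.g. estimateImageTokens 200 151 gives about 40 tokens
```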

```fsharp
let sendImageMessage (imagePath: string) (message: string) = task {
    use httpClient = new HttpClient()
    httpClient.DefaultRequestHeaders.Add("x-api-key", apiKey)
    httpClient.DefaultRequestHeaders.Add("anthropic-version", anthropicVersion)

    let imageBytes = System.IO.File.ReadAllBytes(imagePath)
    let encodedImage = Convert.ToBase64String(imageBytes)

    let imageData =
        { Type = "image"
          Source =
            Some { Type = "base64"
                   MediaType = "image/jpeg"
                   Data = encodedImage }
          Text = None }

    let messageData =
        { Type = "text"
          Source = None
          Text = Some message }

    let userMessage =
        { Role = "user"
          Content = [| imageData; messageData |] }

    let requestData =
        { Model = model
          Messages = [| userMessage |]
          MaxTokens = max_tokens
          Temperature = temperature
          System = None }

    let requestJson = JsonSerializer.Serialize(requestData, jsonSerializerOptions)
    let requestContent = new StringContent(requestJson, Encoding.UTF8, "application/json")

    let! response = httpClient.PostAsync(anthropicUrl, requestContent)
    let! responseJson = response.Content.ReadAsStringAsync()
    let responseData = JsonSerializer.Deserialize<Response>(responseJson, jsonSerializerOptions)

    if not (responseData.Content = null) then
        // Success
        if not (Array.isEmpty responseData.Content) then
            let responseText =
                responseData.Content
                |> Array.filter (fun x -> x.Type = "text")
                |> Array.map (fun x -> x.Text)
                |> String.concat "\n"

            return Some responseText
        else
            return None
    else
        // Error
        let errorResponseData = JsonSerializer.Deserialize<ErrorResponse>(responseJson, jsonSerializerOptions)
        printfn "Error processing request: %A" errorResponseData
        return None
}

let message = "What is in the image?"
let response =
    sendImageMessage "test.jpg" message
    |> Async.AwaitTask
    |> Async.RunSynchronously
printfn $"response: {response}"
```

[Sample image: test.jpg, the snowy scene described below]

Here is the description of the image; not too bad.


response: The image shows a winter landscape covered in snow. It depicts a park or garden setting with bare trees and bushes blanketed in snow. In the foreground, there is a fence or railing, also covered in snow. The scene has a serene, tranquil feel with the monochromatic gray tones created by the snowy conditions. The trees appear slightly obscured or hazy, likely due to falling snow or overcast sky conditions during the winter snowfall captured in the photograph.

So, that’s about it for now. I’ve explored how to use the Anthropic API with F#, covering a couple of different scenarios. Hopefully this offers some inspiration to go out and play with this LLM, or another one, yourself. Until next time, happy coding with F# and Anthropic.