LangChain/LangGraph

This guide explains how to use Integry with LangChain/LangGraph to post a message on Slack.

We can use LangChain agents with ChatAnthropicExperimental, ChatCohere, ChatFireworks, ChatMistralAI, and ChatOpenAI. However, at present, creating agents using LangChain with ChatGoogleGenerativeAI (Gemini) is not supported.

1. Install Required Libraries

First, you need to install the necessary packages:

Integry requires Python version 3.12 or higher

  • Integry to provide the functions that are used as structured tools.

  • LangChain to interact with OpenAI models and integrate structured tools.

  • LangGraph to create the agent and use the tools.

  • langchain_openai to interact with OpenAI’s GPT models.

pip install integry langchain langgraph langchain_openai
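
Integry requires Python 3.12 or higher; if you want to confirm your interpreter version before proceeding, a quick, illustrative check using only the standard library is:

import sys

# Integry requires Python 3.12 or higher; fail early on older interpreters.
assert sys.version_info >= (3, 12), "Integry requires Python 3.12 or higher"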

2. Initialize Integry & LLM

Import the necessary libraries:

import os
from integry import Integry
from langchain_core.messages import SystemMessage, HumanMessage
from langchain_openai import ChatOpenAI
from langchain_core.tools import StructuredTool
from langgraph.prebuilt import create_react_agent

The user ID is a unique string identifier for a user in your app or agent. Function calls and integrations are associated with a user ID. It will be the email address you used during the signup process on Integry.

For example:

user_id = "joe@example.com"

The code snippet below initializes the Integry client to interact with the Integry API using your App-Key and App-Secret.

You can view and copy your App-Key and App-Secret from the Workspace Settings.

integry = Integry(
    app_key=os.environ.get("INTEGRY_APP_KEY"),
    app_secret=os.environ.get("INTEGRY_APP_SECRET"),
)
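
If these environment variables are not set, the client will be created with empty credentials and API calls will fail later. As an optional, illustrative sanity check (reusing the os import from above), you can fail fast instead:

# Optional sanity check (illustrative): fail fast if the credentials are missing.
for var in ("INTEGRY_APP_KEY", "INTEGRY_APP_SECRET"):
    if not os.environ.get(var):
        raise RuntimeError(f"Missing environment variable: {var}")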

Next, initialize an instance of the ChatOpenAI class to interact with OpenAI's GPT-4o model. You can get the API key from the OpenAI Platform.

llm = ChatOpenAI(
    model="gpt-4o",
    api_key=os.environ.get("OPENAI_API_KEY"),
)
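
Optionally, you can run a quick smoke test to confirm the model is reachable before wiring it into the agent. This sketch uses LangChain's standard invoke method and assumes a valid OPENAI_API_KEY:

# Optional smoke test: ask the model for a short reply to confirm the API key works.
reply = llm.invoke("Reply with the single word: ready")
print(reply.content)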

3. Register the Integry Function as a Tool

Before you can use an app's functions, the app must be added to Integry. Slack, however, is pre-added to Integry by default, so there's no need to add it manually.

Now that everything is set up, we will send a message in Slack using Integry's slack-post-message function. You can copy the function ID from the dropdown.

For example

In this case, the function ID is slack-post-message.

After getting the function ID, we register it with the LangChain agent to enable the assistant to call the function.

  • Create the LangChain Tool: Convert the Integry function into a LangChain tool using get_langchain_tool.

  • Set Up the Agent: Create an agent with LangGraph that uses the tool and LLM to post messages to Slack.

slack_post_message = await integry.functions.get("slack-post-message", user_id)

tool = slack_post_message.get_langchain_tool(StructuredTool.from_function, user_id)

agent = create_react_agent(
    tools=[tool],
    model=llm,
)
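
Optionally, you can inspect the registered tool to confirm what the agent will see. The attributes below are standard on LangChain's StructuredTool; the exact values depend on the Integry function definition:

# Optional: inspect the tool's name, description, and input schema.
print(tool.name)         # e.g. "slack-post-message"
print(tool.description)  # the natural-language description shown to the LLM
print(tool.args)         # the argument schema the LLM must fill in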

4. Connect Your Slack Account

To allow the agent to send a message on Slack on your user's behalf, the user must connect their Slack account. To connect a Slack account for the provided user ID, execute the following snippet.

slack = await integry.apps.get("slack", user_id)
print(slack.login_url)

This will print a URL which can be opened in a web browser to connect Slack.
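
Optionally, you can open the URL directly from Python using the standard library, for example:

import webbrowser

# Opens the Slack connection page in the default browser (optional convenience).
webbrowser.open(slack.login_url)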

5. Execute Agent

This will execute the agent and send a hello message to a Slack channel. If you want to target a specific channel, use a prompt like "Say hello to my team on Slack in the #random channel".

await agent.ainvoke({
    "messages": [
        SystemMessage(content="You are a helpful assistant"),
        HumanMessage(content="Say hello to my team on slack"),
    ]
})
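
Because the snippets in this guide use await, they must run inside an event loop (for example, a Jupyter notebook). To run everything as a plain Python script, one possible sketch, reusing the objects created in the earlier steps, is:

import asyncio

async def main():
    result = await agent.ainvoke({
        "messages": [
            SystemMessage(content="You are a helpful assistant"),
            HumanMessage(content="Say hello to my team on slack"),
        ]
    })
    # The agent returns the full message history; the last entry is the final answer.
    print(result["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())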

This will send the message to the Slack channel. Here is a reference image:

You can verify that the message was delivered by checking the highlighted content in the response.
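
If you prefer to inspect the result programmatically rather than reading the sample response below, one illustrative option (assuming the result variable from the sketch above) is to print every message in the trace, which includes the tool call and the tool's output:

# Illustrative: print each message in the agent's trace, including the tool call and result.
for message in result["messages"]:
    print(type(message).__name__, "->", getattr(message, "content", ""))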

Sample Response
