"The Glacier App" is a conversational helper tool that allows users to generate personalized conversation starters, summaries, and interesting facts about a person based on publicly available data from LinkedIn and Twitter. It leverages AI-based natural language processing with LangChain
and APIs to gather data and present it in a user-friendly and engaging format.
This app can be particularly useful for networking professionals, recruiters, or anyone looking to make impactful initial impressions while engaging with someone new.
- Smart Data Aggregation: Scrapes relevant information from LinkedIn and Twitter using intelligent agents.
- AI-Powered Insights: Generates an appealing summary of the person, unique facts, conversation starters, and topics of interest using AI.
- User-Friendly Web Interface: A simple, responsive front-end interface to enter names and view results quickly.
- Dynamic Spinner: Loading spinner to enhance the user experience while processing requests.
- Profile Picture Integration: Automatically fetches and displays the LinkedIn profile picture, if available.
- Input: Enter the name of the person you want insights about into the web form.
- Data Fetching:
  - Fetch LinkedIn and Twitter profile details using `SerpAPIWrapper` and the connected APIs.
  - Scrape key information such as work history, tweets, topics of interest, and profile images.
- AI-Powered Processing: Use the `LangChain` library to:
  - Analyze the fetched data.
  - Generate conversational ice-breakers and facts (see the sketch after this list).
- Output: View a summary, facts, ice-breakers, and suggested topics of interest alongside the profile image, all displayed clearly in the web interface.
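The AI-powered step above is orchestrated with LangChain. A minimal sketch of what such a chain can look like (the prompt wording, variable names, and model settings are illustrative assumptions, not the app's actual code):

```python
# Sketch: prompt -> chat model -> plain-text output, expressed as an LCEL pipeline.
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

prompt = ChatPromptTemplate.from_template(
    "Given this publicly available LinkedIn and Twitter data about a person:\n"
    "{profile_data}\n\n"
    "Write a short summary, two interesting facts, two ice-breaker questions, "
    "and a few topics of interest."
)

llm = ChatOpenAI(temperature=0)           # picks up OPENAI_API_KEY from the environment
chain = prompt | llm | StrOutputParser()  # prompt -> model -> string

profile_data = "...scraped profile and tweet text..."  # placeholder input
print(chain.invoke({"profile_data": profile_data}))
```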
- Python 3.13.2 or higher
- `pipenv` (installed via `pip`) for dependency management
- API keys for:
- LinkedIn scraping
- Twitter API
- SerpAPI for Google Custom Search
- Clone the repository:

  ```bash
  git clone <repo_url>
  cd ice-breaker
  ```
- Install dependencies:

  ```bash
  pipenv install
  pipenv shell
  ```
- Set up your `.env` file with the required API keys (a sketch of loading them follows these steps):

  ```env
  SCRAPIN_API_KEY=<Your_ScrapIn_API_Key>
  OPENAI_API_KEY=<Your_OpenAI_API_Key>
  TAVILY_API_KEY=<Your_Tavily_API_Key>
  LANGSMITH_API_KEY=<Your_LangSmith_API_Key>
  ```
- Run the Flask application:

  ```bash
  python app.py
  ```
- Open your browser and navigate to `http://127.0.0.1:5000/`.
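The application reads these keys from the environment at startup. A minimal sketch of loading them with python-dotenv (an assumption; check `app.py` for how the project actually loads its configuration):

```python
# Sketch: load the keys from .env into the process environment.
# Assumes the python-dotenv package is installed.
import os
from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from .env in the project root

openai_key = os.environ["OPENAI_API_KEY"]    # used by LangChain's OpenAI models
scrapin_key = os.environ["SCRAPIN_API_KEY"]  # used for LinkedIn profile lookups
tavily_key = os.environ["TAVILY_API_KEY"]    # used for web search / scraping
```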
- `app.py`: Main application file that runs the Flask server and routes API calls.
- `templates/index.html`: The web interface used to input names and render the results.
- `static/css/style.css`: Styles for the front-end interface.
- `third_parties/`: `linkedin.py` and `twitter.py` for data scraping from LinkedIn and Twitter respectively.
- `agents/`: Lookup agents for fetching profile URLs based on a name.
- `glacier.py`: Core business logic for creating conversational summaries and facts using LangChain.
- `output_parsers.py`: Defines output schemas for AI-generated data (see the sketch after this list).
- `tools.py`: Custom SerpAPI wrapper for profile URL searches.
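Schemas like those in `output_parsers.py` are typically Pydantic models wrapped in a LangChain `PydanticOutputParser`. The class and field names below are a sketch that mirrors the API response shown below, not necessarily the project's exact definitions:

```python
# Sketch: a Pydantic schema plus a LangChain parser for structured LLM output.
from typing import List
from pydantic import BaseModel, Field
from langchain_core.output_parsers import PydanticOutputParser

class PersonIntel(BaseModel):  # hypothetical class name
    summary: str = Field(description="Short summary of the person")
    interests: List[str] = Field(description="Topics the person is interested in")
    facts: List[str] = Field(description="Interesting facts about the person")
    ice_breakers: List[str] = Field(description="Conversation starters")

parser = PydanticOutputParser(pydantic_object=PersonIntel)
# parser.get_format_instructions() can be injected into the prompt so the model
# returns JSON that parses into PersonIntel.
```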
- Description: Processes the name submitted via the web form and returns the generated data.
- Request:

  ```json
  { "name": "Harrison Chase" }
  ```
- Response:

  ```json
  {
    "summary": "Software Engineer focused on AI development.",
    "interests": ["Python", "Natural Language Processing"],
    "facts": ["Runs LangChain", "Has a passion for AI research"],
    "ice_breakers": ["What inspired your interest in AI?", "Do you teach others about LangChain?"],
    "picture_url": "https://linkedin.com/path/to/profile-pic.jpg"
  }
  ```
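A quick way to exercise the endpoint from a script (the `/process` path and the form-encoded body are assumptions; check the route definition in `app.py`):

```python
# Sketch: call the running Flask app and print the generated ice-breakers.
import requests

resp = requests.post(
    "http://127.0.0.1:5000/process",   # assumed route; see app.py
    data={"name": "Harrison Chase"},   # the form field submitted by index.html
)
resp.raise_for_status()
payload = resp.json()

print(payload["summary"])
for question in payload["ice_breakers"]:
    print("-", question)
```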
Backend:
- Python 3.13.2
- Flask
- LangChain (for AI task orchestration)
- Tweepy (Twitter scraping)
Frontend:
- HTML/CSS
- FontAwesome (icons)
External APIs:
- ScrapIn (LinkedIn profile lookups)
- Tavily API (scraping publicly accessible data)
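As an illustration of the Tweepy piece, recent tweets can be pulled with the v2 `Client` API roughly like this (the bearer-token variable and the example handle are placeholders, not the project's configuration):

```python
# Sketch: fetch a user's recent original tweets with Tweepy's v2 Client.
import os
import tweepy

client = tweepy.Client(bearer_token=os.environ["TWITTER_BEARER_TOKEN"])  # assumed env var

user = client.get_user(username="hwchase17")  # example handle
tweets = client.get_users_tweets(
    id=user.data.id,
    max_results=5,
    exclude=["retweets", "replies"],
)

for tweet in tweets.data or []:
    print(tweet.text)
```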