Twint - A Twitter scraper that rocks


Need to scrape Twitter accounts and then crunch the data yourself to get the best out of it?

No problem with Twint!

This command-line tool lets you retrieve all the tweets from any Twitter account without having to log in, without going through the Twitter API, and without any limits.

To install Twint:

pip3 install twint
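
Once that finishes, a quick import check confirms the package is available. A minimal sketch, assuming the pip3 install above completed without errors:

# Sanity check: if this import fails, Twint is not installed correctly.
import twint

# The Config class is the entry point for every scrape.
print(twint.Config)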

To pull this off, Twint uses the classic Twitter search, which lets it retrieve all the tweets from an account.
It can also retrieve only the tweets containing a keyword of your choice, those posted between two given dates or from a given place, or only the retweets. It is even possible to collect every tweet, from any Twitter account, that mentions a keyword of your choice. That's nice!

That is plenty to run detailed analyses, build statistics, extract data, or keep a backup of the Twitter accounts that interest you.
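
Twint can also be driven from Python instead of the command line. Here is a minimal sketch based on its module API; the username and keyword are placeholders, and the exact set of Config options can vary between Twint versions.

import twint

# Build a configuration object; each attribute mirrors a CLI flag.
c = twint.Config()
c.Username = "username"   # same as: twint -u username
c.Search = "pineapple"    # same as: -s pineapple
c.Limit = 100             # optional: stop after roughly 100 Tweets

# Run the search; Tweets are printed to stdout by default.
twint.run.Search(c)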

Here are the commands you can use with Twint:

twint -u username - Scrape all the Tweets from the user's timeline.

twint -u username -s pineapple - Scrape all Tweets from the user's timeline containing pineapple.

twint -s pineapple - Collect every Tweet containing pineapple from everyone's Tweets.

twint -u username --year 2014 - Collect Tweets that were tweeted before 2014.

twint -u username --since 2015-12-20 - Collect Tweets that were tweeted since 2015-12-20.

twint -u username -o file.txt - Scrape Tweets and save to file.txt.

twint -u username -o file.csv --csv - Scrape Tweets and save as a csv file.

twint -u username --email --phone - Show Tweets that might have phone numbers or email addresses.

twint -s "Donald Trump" --verified - Display Tweets by verified users that Tweeted about Donald Trump.

twint -g="48.880048,2.385939,1km" -o file.csv --csv - Scrape Tweets from a radius of 1km around a place in Paris and export them to a csv file.

twint -u username -es localhost:9200 - Output Tweets to Elasticsearch.

twint -u username -o file.json --json - Scrape Tweets and save as a json file.

twint -u username --database tweets.db - Save Tweets to a SQLite database.

twint -u username --followers - Scrape a Twitter user's followers.

twint -u username --following - Scrape who a Twitter user follows.

twint -u username --favorites - Collect all the Tweets a user has favorited.

twint -u username --following --user-full - Collect full user information for the accounts a user follows.

twint -u username --profile-full - Use a slow, but effective method to gather Tweets from a user's profile (Gathers ~3200 Tweets, Including Retweets).

twint -u username --retweets - Use a quick method to gather the last 900 Tweets (that includes retweets) from a user's profile.

twint -u username --resume 10940389583058 - Resume a search starting from the specified Tweet ID.
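
The module API exposes the same options as the flags above. As an illustration, here is a hedged sketch combining a date range with a CSV export; the attribute names follow Twint's Config object, but check them against the version you have installed, and the username and dates are placeholders.

import twint

c = twint.Config()
c.Username = "username"    # -u username
c.Since = "2015-12-20"     # --since 2015-12-20
c.Until = "2016-12-20"     # --until 2016-12-20 (end date, placeholder)
c.Store_csv = True         # --csv
c.Output = "file.csv"      # -o file.csv

# Run the search and write the matching Tweets to file.csv.
twint.run.Search(c)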

Have fun :)

Project repository: twintproject/twint
An advanced Twitter scraping & OSINT tool written in Python that doesn't use Twitter's API, allowing you to scrape a user's followers, following, Tweets and more while evading most ...
