Examining Twitter Sentiment (The Easy Way)

If you came here to learn the intricacies of sentiment analysis, you came to the wrong post. This is a simple implementation of the TextBlob module using the Twitter API via Tweepy. We will store the tweets, as well as their sentiment, in a database via PyMySQL. The first step is to create an app in order to gain the tokens necessary to access the Twitter API. You can do that here:

https://apps.twitter.com/

Once you have signed up (note that you need to have the Twitter app on your phone in order for this to work), navigate to the Keys and Access Tokens tab within your application page. You will need to copy the consumer key and consumer secret, as well as the access token and access token secret.

Once you have copied the necessary keys, we can start constructing our script to pull tweets from the Twitter API. To start, we import the packages the script needs. Since we will be storing the tweets and their sentiment, we need to establish a connection object for our database. We will also create variables for all the keys needed to access the Twitter API. For the heck of it, we will also put our topic name in a variable to pass in.
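The setup just described might look something like the sketch below. The key values are placeholders for your own credentials, and the database name (`twitterdb`), connection details, and topic (`bitcoin`) are assumptions for illustration, not values from the original post.

```python
import json

import pymysql
import tweepy
from textblob import TextBlob

# Twitter API credentials -- copy these from the
# Keys and Access Tokens tab of your application page
consumer_key = "YOUR_CONSUMER_KEY"
consumer_secret = "YOUR_CONSUMER_SECRET"
access_token = "YOUR_ACCESS_TOKEN"
access_token_secret = "YOUR_ACCESS_TOKEN_SECRET"

# Topic to stream tweets about (example value)
topic = "bitcoin"

# Connection object for the database that will hold the tweets
conn = pymysql.connect(host="localhost",
                       user="root",
                       password="YOUR_DB_PASSWORD",
                       db="twitterdb",
                       charset="utf8mb4")
```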

Next, we will create a class to stream tweets with Tweepy. For more information on this, visit: http://docs.tweepy.org/en/v3.4.0/streaming_how_to.html

We can load the returned data as a JSON object and check whether there is any text in the tweet body. If there isn't, we know we will receive an error, so we must handle that case. If the tweet does contain text, we can extract the elements we need from the data: first the tweet itself, then the username, and then the data we will obtain from TextBlob. Polarity will reflect sentiment, while subjectivity will describe how subjective or objective the tweet is. Please note that this is a quick-and-dirty method that pales in comparison to using the NLTK library, or to more advanced approaches to identifying sentiment such as an artificial neural network; we aren't trying to reinvent the wheel here. Finally, we can store the desired data in our database.

Next, we can pass in the necessary tokens, as well as our topic, and run our script.

About the author

programmingforfinance

Hi, I'm Frank. I have a passion for coding, which I apply primarily within the realm of finance.

