Data. You can't escape it. Almost everything you do right now, and certainly everything you will do in the future, will involve generating, using and manipulating data. The objective of this series is to get a taste of the power that Python affords when it comes to importing and handling data. The Pandas module is where the party is at - whether it's loading large .csv or .xlsx files from your computer or getting them straight from the web, Pandas has you covered. Combine this with the web scraping module 'Beautiful Soup', Matplotlib and Pandas' own built-in visualization functionality, and you've got yourself a pretty serious data analysis toolkit! So let's see what the great Sam Ball has to say on this subject. Have fun folks.
If you haven't been to a #HelloPython Hive yet then you may not have come across our introductory resources on the Numpy and Pandas modules. Here they are again, just for you. If you have been to #HelloPython already then taking a second look won't hurt (but you may wish to skip bits and get straight to the good stuff in the subsequent lessons).
Pandas comes with some neat functions that are essential to know for importing data into Python. We will take a quick look at the two main file types you will receive data in - CSV and Excel files - and how to import them as pandas DataFrames.
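As a minimal sketch of what this looks like, here is `pd.read_csv` in action. To keep the example self-contained it reads from an in-memory string rather than a file on disk; in practice you would pass a file path (the names `my_data.csv` and `my_data.xlsx` below are just placeholders):

```python
import io
import pandas as pd

# A small CSV sample held in a string; in real use you would pass a
# path like "my_data.csv" (hypothetical name) straight to pd.read_csv.
csv_text = """name,score
Alice,91
Bob,84
"""

df = pd.read_csv(io.StringIO(csv_text))
print(df.shape)           # (2, 2)
print(list(df.columns))   # ['name', 'score']

# Excel files work much the same way via pd.read_excel (this needs an
# Excel engine such as openpyxl installed):
# df = pd.read_excel("my_data.xlsx", sheet_name=0)
```

Both functions return a DataFrame, so everything downstream (filtering, plotting, summarizing) looks identical regardless of which file type the data arrived in.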
Sometimes data is stored online in a format that isn't so easy to convert to a downloadable file. Python has a number of packages that can read data from webpages, as well as packages that can interact with APIs such as Google Finance to retrieve data from the web. Here we will be looking at how to use the pandas-datareader and Beautiful Soup modules to collect data from the web, ready for analysis.
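To give a flavour of the scraping side, here is a small Beautiful Soup sketch that pulls rows out of an HTML table. The HTML is embedded as a string so the example runs without a network connection; on a real page you would fetch the markup first (e.g. with the `requests` library) and the table contents shown here are made up:

```python
from bs4 import BeautifulSoup

# Stand-in for HTML you would normally download from a webpage,
# e.g. html = requests.get(url).text
html = """
<table>
  <tr><th>Country</th><th>Population</th></tr>
  <tr><td>Iceland</td><td>376000</td></tr>
  <tr><td>Malta</td><td>519000</td></tr>
</table>
"""

soup = BeautifulSoup(html, "html.parser")

rows = []
for tr in soup.find_all("tr")[1:]:  # skip the header row
    cells = [td.get_text() for td in tr.find_all("td")]
    rows.append(cells)

print(rows)  # [['Iceland', '376000'], ['Malta', '519000']]
```

For pandas-datareader the pattern is similarly compact: a single `DataReader` call names the ticker and the data source, and you get a DataFrame back, ready for the same analysis tools as any locally loaded file.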