This exercise will require you to pull some data from the Quandl API. Quandl is currently the most widely used aggregator of financial market data.

Set API key

As a first step, you will need to register a free account on the http://www.quandl.com website.

After you register, you will be provided with a unique API key that you should store:

In [1]:
# Store the API key as a string - according to PEP8, constants are always named in all upper case
API_KEY = 'rFZVQKZ-_3k2H2zMyDxk'

Quandl has a large number of data sources, but, unfortunately, most of them require a Premium subscription. Still, there are also a good number of free datasets.

Imports

For this mini project, we will focus on equities data from the Frankfurt Stock Exchange (FSE), which is available for free. We'll try to analyze the stock prices of a company called Carl Zeiss Meditec, which manufactures tools for eye examinations, as well as medical lasers for laser eye surgery: https://www.zeiss.com/meditec/int/home.html. The company is listed under the stock ticker AFX_X.

You can find the detailed Quandl API instructions here: https://docs.quandl.com/docs/time-series

While there is a dedicated Python package for connecting to the Quandl API, we would prefer that you use the requests package, which can be easily downloaded using pip or conda. You can find the documentation for the package here: http://docs.python-requests.org/en/master/
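
For example, rather than concatenating the query string by hand, requests can build it from a params dictionary. The snippet below is only a sketch of that pattern; it assumes the FSE/AFX_X endpoint used later in this notebook and the API_KEY defined above:

import requests

# A sketch of the same call with a params dict; requests URL-encodes the query string for you
resp = requests.get(
    'https://www.quandl.com/api/v3/datasets/FSE/AFX_X.json',
    params={'start_date': '2017-01-02', 'end_date': '2017-01-02', 'api_key': API_KEY},
)
print(resp.status_code)   # 200 if the call succeeded
print(resp.url)           # the full URL, including the encoded query parameters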

Finally, apart from the requests package, you are encouraged not to use any third-party Python packages, such as pandas, and instead focus on what's available in the Python Standard Library (the collections module might come in handy: https://pymotw.com/3/collections/).

Also, since you won't have access to DataFrames, you are encouraged to use Python's native data structures - preferably dictionaries, though some questions can also be answered using lists.

You can read more on these data structures here: https://docs.python.org/3/tutorial/datastructures.html

Keep in mind that the JSON responses you will be getting from the API map almost one-to-one to Python's dictionaries. Unfortunately, they can be very nested, so make sure you read up on indexing dictionaries in the documentation provided above.
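
As a small illustration (a toy dictionary, not the real response), the values you need typically sit a few levels down, so you reach them by chaining key and index lookups:

# Toy example of the kind of nesting you will see in the real response
toy = {'dataset': {'column_names': ['Date', 'Open'],
                   'data': [['2017-01-02', 34.99]]}}
print(toy['dataset']['column_names'])   # ['Date', 'Open']
print(toy['dataset']['data'][0][1])     # 34.99 -- key, key, list index, list index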

In [2]:
# First, import the relevant modules
import requests
import json
import collections

Note: APIs can change a bit with each version; for this exercise it is recommended to use the V3 Quandl API at https://www.quandl.com/api/v3/

Pull out the data via API

In [3]:
# Tasks 1 and 2: call the Quandl API and pull the AFX_X data for the whole of 2017,
# then convert the returned JSON object into a Python dictionary so we can get a
# glimpse into its (nested) structure

url = "https://www.quandl.com/api/v3/datasets/FSE/AFX_X.json?"+ "&start_date=2017-01-01&end_date=2017-12-31&api_key=" + API_KEY


r = requests.get(url)
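r.raise_for_status()   # optional sanity check (not part of the original exercise): raises HTTPError on a 4xx/5xx response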

# Decode the JSON data into a dictionary: json_data
json_data = r.json()
In [4]:
# Inspect the JSON structure of the object you created, and take note of how nested it is,
# as well as the overall structure

type(json_data)
Out[4]:
dict

These are your tasks for this mini project:

  1. Collect data from the Frankfurt Stock Exchange, for the ticker AFX_X, for the whole year 2017 (keep in mind that the date format is YYYY-MM-DD).
  2. Convert the returned JSON object into a Python dictionary.
  3. Calculate what the highest and lowest opening prices were for the stock in this period.
  4. What was the largest change in any one day (based on High and Low price)?
  5. What was the largest change between any two days (based on Closing Price)?
  6. What was the average daily trading volume during this year?
  7. (Optional) What was the median trading volume during this year? (Note: you may need to implement your own function for calculating the median.)
In [5]:
json_data['dataset'].keys()
Out[5]:
dict_keys(['id', 'dataset_code', 'database_code', 'name', 'description', 'refreshed_at', 'newest_available_date', 'oldest_available_date', 'column_names', 'frequency', 'type', 'premium', 'limit', 'transform', 'column_index', 'start_date', 'end_date', 'data', 'collapse', 'order', 'database_id'])
In [6]:
column_names = json_data['dataset']['column_names']
column_names
Out[6]:
['Date',
 'Open',
 'High',
 'Low',
 'Close',
 'Change',
 'Traded Volume',
 'Turnover',
 'Last Price of the Day',
 'Daily Traded Units',
 'Daily Turnover']
In [7]:
data = json_data['dataset']['data']

EDA
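
The cells below index each row positionally, so keep the column_names list in mind: index 1 is 'Open', 2 is 'High', 3 is 'Low', 4 is 'Close' and 6 is 'Traded Volume'. As an optional aside (not used by the cells that follow), the collections module imported earlier could wrap each row in a namedtuple so that fields can be addressed by name; this is just a sketch built on the column_names and data variables defined above:

# Optional: give the positional row fields readable names with a namedtuple
from collections import namedtuple

Day = namedtuple('Day', [name.replace(' ', '_') for name in column_names])
first_row = Day(*data[0])
print(first_row.Date, first_row.Open, first_row.Traded_Volume)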

In [22]:
# 3 Calculate what the highest and lowest opening prices were for the stock in this period.
open_prices = [row[1] for row in data if row[1] is not None]   # column 1 is 'Open'; skip days with a missing value
print(f'min = ${min(open_prices)}, max = ${max(open_prices)}')
min = $34.0, max = $53.11
In [23]:
# 4 What was the largest change in any one day (based on High and Low price)?
change = [abs(row[2] - row[3]) for row in data]   # column 2 is 'High', column 3 is 'Low'
print(f'${max(change):.2f}')
$2.81
In [27]:
# 5 What was the largest change between any two business days (based on Closing Price)?
close_change = [abs(data[i][4] - data[i + 1][4]) for i in range(len(data) - 1)]   # column 4 is 'Close'; compare consecutive trading days
print(f'${max(close_change):.2f}')
$2.56
In [49]:
# 6 What was the average daily trading volume during this year?
print(f'{sum(row[6] for row in data) / len(data):.1f}')   # column 6 is 'Traded Volume'
89124.3
In [53]:
# 7 (Optional) What was the median trading volume during this year? (Note: you may need to implement your own function for calculating the median.)
def median(values):
    s = sorted(values)                  # work on a sorted copy of the values
    mid = len(s) // 2
    return s[mid] if len(s) % 2 else (s[mid - 1] + s[mid]) / 2

volumes = [row[6] for row in data if row[6] is not None]
median(volumes)
Out[53]:
76286.0