For this project, I designed and implemented a MySQL database hosted on AWS to store data about US stocks. I also created a Python script to clean and transfer financial data from my broker into the MySQL database.
A complete description of the project can be found below. The Python script can be accessed in the project's GitHub repository.
My primary motivation for this project was to practice using the skills and tools listed below. Secondarily, I wanted to improve my stock trading system by storing my data in a relational database rather than in CSV files.
I started by specifying the requirements for my database. At a minimum, the database must be able to store the exchanges on which US stocks are listed, the stock contracts themselves, and the historical price/volume and fundamental data that characterize each stock's performance.
I then created an ER diagram to help me design a database that meets these requirements. Given the highly structured nature of the data, I chose a relational database model over a non-relational model.
There are four entities (exchange, contract, price_volume, and fundamental) which correspond to the four tables in the database. Each stock contract has an associated exchange on which it is listed as well as price/volume and fundamental data that characterize the performance of the stock. The database is normalized so it can be easily modified to support additional contract types and performance metrics.
I chose MySQL as the relational database management system as it is open-source, widely used, and extensively documented.
Next, I created a MySQL database instance using Amazon's Relational Database Service (RDS), one of the many services available through the Amazon Web Services (AWS) platform. I took advantage of the RDS free tier, which provides 20 GB of SSD storage for one year. The database could have been instantiated on my local computer instead, but I opted for RDS to get further acquainted with AWS and to have a simple path for scaling up the database in the future.
To create the schema for my database, I connected to the database from MySQL Workbench, the official MySQL client, and defined the schema below based on my ER diagram.
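The schema itself was built in Workbench, but the condensed sketch below, expressed as CREATE TABLE statements executed through MySQL Connector, gives a sense of the design. Apart from the exchange columns, which appear in the insert snippet later in this write-up, the column names and types here are assumptions for illustration, not the actual schema.

import mysql.connector

# Hypothetical connection details for the RDS-hosted database.
cnx = mysql.connector.connect(host='<rds-endpoint>', user='<user>',
                              password='<password>', database='securities')
cursor = cnx.cursor()

# Condensed sketch of the four tables; most columns are illustrative guesses.
schema = [
    """CREATE TABLE exchange (
           id INT AUTO_INCREMENT PRIMARY KEY,
           abbrev VARCHAR(32) NOT NULL UNIQUE,   -- e.g. 'NASDAQ'
           name VARCHAR(255) NOT NULL,
           ib_name VARCHAR(32) NOT NULL,         -- exchange name used by IB
           state VARCHAR(32),
           city VARCHAR(32))""",
    """CREATE TABLE contract (
           id INT AUTO_INCREMENT PRIMARY KEY,
           exchange_id INT NOT NULL,             -- listing exchange
           symbol VARCHAR(32) NOT NULL,
           currency VARCHAR(8) NOT NULL,
           FOREIGN KEY (exchange_id) REFERENCES exchange (id))""",
    """CREATE TABLE price_volume (
           id INT AUTO_INCREMENT PRIMARY KEY,
           contract_id INT NOT NULL,
           bar_datetime DATETIME NOT NULL,       -- start of each 30-minute bar
           open_price DECIMAL(10, 2),
           high_price DECIMAL(10, 2),
           low_price DECIMAL(10, 2),
           close_price DECIMAL(10, 2),
           volume BIGINT,
           FOREIGN KEY (contract_id) REFERENCES contract (id))""",
    """CREATE TABLE fundamental (
           id INT AUTO_INCREMENT PRIMARY KEY,
           contract_id INT NOT NULL,
           market_cap DECIMAL(18, 2),            -- parsed from IB's MKTCAP tag
           FOREIGN KEY (contract_id) REFERENCES contract (id))""",
]
for statement in schema:
    cursor.execute(statement)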
Before writing the Python scripts to add data to the database, I specified the version control strategy that I would use for this project and my trading system as a whole. I chose Git as my distributed version control system because it is open-source, robust, and well documented. I also decided to push my local commits to GitHub to remotely back up my code and to facilitate collaboration if the need arises.
When designing a workflow for my trading system, I was inspired by the widely used Gitflow workflow detailed by Vincent Driessen. Gitflow is well documented and effectively handles many of the issues that may arise in a software project, but it is too complex for this relatively simple project. The workflow I designed for this project utilizes the same "master", "develop", and "feature" branches that are specified in Gitflow but simplifies the branching and semantic versioning process. This simplified workflow is illustrated below.
As the rest of my trading system was developed in Python, I decided to use Python to insert data into the database and to query it. This allowed me to interface with my broker and with the MySQL database from the same Python script.
I used Anaconda and its virtual environments to manage my Python packages, creating a new virtual environment for every semantic version upgrade of my trading system. I installed the MySQL Connector/Python package in my virtual environment to provide an interface between Python and MySQL.
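The snippets in the rest of this write-up reference "self.cursor" and "self.cnx". A minimal, hypothetical skeleton of how those objects might be created in "data_acquisition.py" is shown below; the class name and constructor arguments are assumptions.

import mysql.connector

class DataAcquisition:
    """Hypothetical skeleton of the class in data_acquisition.py."""

    def __init__(self, host, user, password, database):
        # self.cnx and self.cursor are the connection and cursor objects
        # referenced by the snippets in this write-up.
        self.cnx = mysql.connector.connect(host=host, user=user,
                                           password=password,
                                           database=database)
        self.cursor = self.cnx.cursor()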
I started by inserting data into the exchange table of the database. As my trading system focuses on only two exchanges, NASDAQ and NYSE, I performed this insertion manually using the code snippet below. The variables "cursor" and "cnx" refer to MySQL Connector cursor and connection objects, and the INSERT statement is standard SQL with %s placeholders that MySQL Connector substitutes with the values passed to execute(). Refer to "data_acquisition.py" to see this code snippet in context.
insert_statement = ('INSERT INTO exchange '
                    '(abbrev, name, ib_name, state, city) '
                    'VALUES (%s, %s, %s, %s, %s)')
exchanges = [('NASDAQ', 'Nasdaq', 'ISLAND', 'New York', 'New York'),
             ('NYSE', 'New York Stock Exchange', 'NYSE', 'New York', 'New York')]
for exchange in exchanges:
    self.cursor.execute(insert_statement, exchange)
self.cnx.commit()
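Passing the values to execute() as a separate tuple, rather than formatting them into the statement string directly, lets MySQL Connector handle the quoting and escaping of the values, which guards against malformed statements and SQL injection.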
I then used the data in the exchange table to populate the contract table. The data for the contracts was acquired from Interactive Brokers (IB) using their Python API. I followed the steps below, as seen in "data_acquisition.py" and sketched in code after the list.
Select the id and abbrev attributes from the exchange table using MySQL Connector.
For every exchange in the result set, use the IB API to scan for the 50 most actively traded stocks on that exchange priced between 10 and 20 USD.
Use the IB API to request the full contract details for every contract returned by the scan.
Parse the ContractDetails objects returned by the IB API to create a list of dictionaries, where each dictionary corresponds to a contract and the keys of each dictionary are the attributes of the contract table.
Use MySQL Connector to iteratively insert the data in each contract dictionary into the contract table of the MySQL database.
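A condensed sketch of these five steps is shown below. Here scan_most_active() and contract_details() are hypothetical, blocking stand-ins for the IB API's asynchronous reqScannerSubscription and reqContractDetails requests, the contract columns are the same assumed ones as in the schema sketch above, and the self. prefixes are omitted for brevity.

# Step 1: select the id and abbrev attributes from the exchange table.
cursor.execute('SELECT id, abbrev FROM exchange')

contracts = []
for exchange_id, abbrev in cursor.fetchall():
    # Step 2: the 50 most actively traded stocks on this exchange priced
    # between 10 and 20 USD (hypothetical wrapper around the IB scanner).
    for scan_result in scan_most_active(exchange=abbrev, above_price=10,
                                        below_price=20, number_of_rows=50):
        # Step 3: full contract details (hypothetical wrapper around
        # reqContractDetails).
        details = contract_details(scan_result)
        # Step 4: one dictionary per contract, keyed by the attributes of
        # the contract table.
        contracts.append({'exchange_id': exchange_id,
                          'symbol': details.contract.symbol,
                          'currency': details.contract.currency})

# Step 5: iteratively insert each dictionary; the named placeholders map
# to the dictionary keys.
insert_statement = ('INSERT INTO contract (exchange_id, symbol, currency) '
                    'VALUES (%(exchange_id)s, %(symbol)s, %(currency)s)')
for contract in contracts:
    cursor.execute(insert_statement, contract)
cnx.commit()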
Next, I used the data in the contract table in conjunction with the IB API to populate the price_volume table. I followed the steps below, as seen in "data_acquisition.py" and sketched in code after the list.
Use MySQL Connector to select the attributes from the contract table that are necessary to define a contract in the IB API.
For every contract in the result set, use the IB API to get the historical 30-minute price/volume data for that contract.
Parse the BarData objects returned by the IB API to create a list of dictionaries, where each dictionary corresponds to a row in the price_volume table and the keys of each dictionary are the attributes of the price_volume table.
Use MySQL Connector to iteratively insert the data in each price_volume dictionary into the price_volume table of the MySQL database.
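The sketch below follows the same pattern. historical_bars() is a hypothetical, blocking stand-in for the IB API's asynchronous reqHistoricalData request with a bar size of "30 mins"; date, open, high, low, close, and volume are actual attributes of the BarData objects it would yield.

# Step 1: the attributes needed to define an IB contract (assumed columns).
cursor.execute('SELECT id, symbol, currency FROM contract')

rows = []
for contract_id, symbol, currency in cursor.fetchall():
    # Step 2: historical 30-minute bars for this contract (hypothetical
    # wrapper around reqHistoricalData).
    for bar in historical_bars(symbol=symbol, currency=currency,
                               bar_size='30 mins'):
        # Step 3: one dictionary per BarData object, keyed by the attributes
        # of the price_volume table. Note that bar.date is a string and may
        # need to be parsed into a datetime before insertion.
        rows.append({'contract_id': contract_id,
                     'bar_datetime': bar.date,
                     'open_price': bar.open,
                     'high_price': bar.high,
                     'low_price': bar.low,
                     'close_price': bar.close,
                     'volume': bar.volume})

# Step 4: iteratively insert each dictionary into the price_volume table.
insert_statement = (
    'INSERT INTO price_volume '
    '(contract_id, bar_datetime, open_price, high_price, low_price, '
    'close_price, volume) '
    'VALUES (%(contract_id)s, %(bar_datetime)s, %(open_price)s, '
    '%(high_price)s, %(low_price)s, %(close_price)s, %(volume)s)')
for row in rows:
    cursor.execute(insert_statement, row)
cnx.commit()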
Finally, taking an approach similar to the one I used for the price_volume table, I used the data in the contract table in conjunction with the IB API to populate the fundamental table. I followed the steps below, as seen in "data_acquisition.py"; a sketch of the parsing step follows the list.
Use MySQL Connector to select the attributes from the contract table that are necessary to define a contract in the IB API.
For every contract in the result set, use the IB API to get the fundamental data for that contract.
Parse the string objects returned by the IB API to create a list of dictionaries, where each dictionary corresponds to a row in the fundamental table and the keys of each dictionary are the attributes of the fundamental table.
For every dictionary key, use a regular expression like the one below to extract the corresponding fundamental attribute's value from the string returned by IB. In the code snippet below, "re" is the built-in Python regular expression module, "value" is the string object returned by IB, and "MKTCAP" is the specified fundamental attribute.
re.search('MKTCAP=(.+?);', value)
If a fundamental attribute is not present in the string object, specify None for the value of the corresponding dictionary key.
Use MySQL Connector to iteratively insert the data in each fundamental dictionary into the fundamental table of the MySQL database.
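The parsing step can be captured in a small helper function like the sketch below, which wraps the regular expression shown above and returns None when an attribute is missing.

import re

def extract_fundamental(value, tag):
    """Extract one fundamental attribute (e.g. 'MKTCAP') from the
    semicolon-delimited string returned by IB, or None if it is absent."""
    match = re.search('{}=(.+?);'.format(re.escape(tag)), value)
    return match.group(1) if match else None

# e.g. extract_fundamental('MKTCAP=1234.5;', 'MKTCAP') returns '1234.5'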
The functionality of the database can be extended by incorporating other data sources, accommodating additional contract types, and adding additional performance metrics such as the social sentiment towards a stock.