general updates made, updated lots of stuff

data/HistoricalData.json — new file, 20130 lines (file diff suppressed because it is too large)

src/MidasV1/README.md — new file, 478 lines
# MidasV1

<!-- Replace with actual logo if available -->

## Table of Contents

- [Overview](#overview)
- [General Technical Overview](#general-technical-overview)
- [Workflow & Program Design](#workflow--program-design)
  - [Module 1: Initial Checks](#module-1-initial-checks)
    - [a. Operating System Check](#a-operating-system-check)
    - [b. Dependency Check](#b-dependency-check)
    - [c. Connectivity Check](#c-connectivity-check)
  - [Module 2: IBJTS List Petitioner](#module-2-ibjts-list-petitioner)
    - [a. Scanner](#a-scanner)
    - [b. Refiner](#b-refiner)
  - [Module 3: Stock Information Retrieval](#module-3-stock-information-retrieval)
    - [a. Load](#a-load)
    - [b. Threaded Information Gathering & Choosing Strategy](#b-threaded-information-gathering--choosing-strategy)
    - [c. Strategy Implementation & Market Determination](#c-strategy-implementation--market-determination)
  - [Module 4: Option Chain Trading & Risk Management](#module-4-option-chain-trading--risk-management)
    - [a. Option Chain Data](#a-option-chain-data)
    - [b. Risk Management Stage 1](#b-risk-management-stage-1)
    - [c. Buying and Selling / Risk Management Stage 2](#c-buying-and-selling--risk-management-stage-2)
  - [General Additions](#general-additions)
- [File Structure](#file-structure)
- [Installation](#installation)
- [Configuration](#configuration)
- [Usage](#usage)
- [Logging](#logging)
- [Future Enhancements](#future-enhancements)
- [Disclaimer](#disclaimer)
---

## Overview

**MidasV1** is a trading bot written in Python, designed to interact with the Interactive Brokers (IB) Gateway or the IBJTS API and JTS. Built on a modular architecture, MidasV1 performs comprehensive system checks, retrieves and refines stock data, executes trading strategies based on real-time market analysis, and manages risk.

This README provides an in-depth overview of MidasV1's architecture, functionality, and setup instructions.

---

## General Technical Overview

- **Programming Language:** Python
- **Dependencies:**
  - `ibapi`: Interactive Brokers API for Python
  - `psutil`: For system resource monitoring
- **Requirements:**
  - **IBJTS API & JTS:** Ensure that the IBJTS API and JTS are running.
  - **IB Gateway:** Alternatively, IB Gateway can be used for connectivity.
- **Architecture:** Highly modular, facilitating scalability and maintainability.

---
## Workflow & Program Design

MidasV1 is structured into multiple modules, each responsible for distinct functionality, ensuring a clear separation of concerns and streamlined operations.

### Module 1: Initial Checks

#### a. Operating System Check

- **Purpose:** Determine the operating system of the device running MidasV1.
- **Supported OS:** Linux
- **Unsupported OS:** Windows, macOS, BSD, illumos, etc.
- **Behavior:**
  - **Linux:** Continues execution and prints a green success message.
  - **Unsupported OSes:** Prints red error messages and yellow warnings indicating future support considerations, then exits gracefully.
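The OS gate described above can be sketched as follows. This is a minimal illustration, not MidasV1's actual code; the function name, message wording, and ANSI color codes are assumptions:

```python
import platform

SUPPORTED_SYSTEMS = {"Linux"}

def check_operating_system(system=None):
    """Return True if the (detected or injected) OS is supported, with colored output."""
    system = system or platform.system()
    if system in SUPPORTED_SYSTEMS:
        print("\033[92m[OK] Operating System %s is supported.\033[0m" % system)
        return True
    print("\033[91m[ERROR] %s is not supported.\033[0m" % system)
    print("\033[93m[WARN] Support for this OS may be considered in the future.\033[0m")
    return False

check_operating_system("Linux")  # prints the green success message
```

The caller would exit gracefully (e.g. `sys.exit(1)`) when this returns `False`.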
#### b. Dependency Check

- **Purpose:** Ensure all necessary Python packages are installed.
- **Mechanism:**
  - **Source:** Reads dependencies from `requirements.txt`, located in the parent directory of the `modules` folder.
  - **Process:**
    - Parses `requirements.txt` to extract package names.
    - Checks whether each package is installed using `pkg_resources`.
  - **Missing Dependencies:** Informs the user in red text and provides the exact `pip` command to install them.
  - **All Dependencies Present:** Confirms with a green success message.
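A sketch of this check follows. Note that it uses the standard library's `importlib.metadata` rather than the `pkg_resources` mentioned above (which is deprecated in recent setuptools releases); the helper names are illustrative:

```python
import importlib.metadata

def parse_requirement(line):
    """Strip comments and version specifiers from one requirements.txt line."""
    line = line.split("#", 1)[0].strip()
    for sep in ("==", ">=", "<=", "~=", ">", "<"):
        line = line.split(sep, 1)[0]
    return line.strip()

def find_missing(names):
    """Return the subset of package names that are not installed."""
    missing = []
    for name in names:
        try:
            importlib.metadata.version(name)
        except importlib.metadata.PackageNotFoundError:
            missing.append(name)
    return missing

requirements = ["ibapi>=9.81  # Interactive Brokers API", "psutil"]
missing = find_missing([parse_requirement(line) for line in requirements])
if missing:
    print("pip install " + " ".join(missing))  # the exact command to show the user
```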
#### c. Connectivity Check

- **Purpose:** Verify a functional connection with the IB Gateway.
- **Configuration:**
  - **Source:** Retrieves `host`, `port`, and `client_id` from `config/config.config`, with default fallbacks of `127.0.0.1`, `4002`, and `0` respectively.
- **Behavior:**
  - Attempts to establish a connection to the IB Gateway.
  - **Success:** Prints green messages confirming the connection and the successful retrieval of account summaries.
  - **Failure:** Prints red error messages and yellow warnings, then exits gracefully.
- **Enhancements:**
  - Uses colored console output for clear, human-readable reporting.
  - Handles the unused `advancedOrderRejectJson` parameter to eliminate warnings.
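Before attempting the full `ibapi` handshake, a cheap pre-flight probe can fail fast when the gateway is not listening at all. This is a sketch under that assumption (MidasV1's actual check goes through the ibapi client); the function name is hypothetical:

```python
import socket

def can_reach(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

host, port = "127.0.0.1", 4002  # the documented config defaults
if can_reach(host, port):
    print("\033[92m[OK] IB Gateway reachable at %s:%d\033[0m" % (host, port))
else:
    print("\033[91m[ERROR] Cannot reach IB Gateway at %s:%d\033[0m" % (host, port))
```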
### Module 2: IBJTS List Petitioner

#### a. Scanner

- **Purpose:** Retrieve a list of stocks that meet predefined criteria.
- **Configuration:** Loads criteria such as search volume, net change, and percent change from `config/config.config`.
- **Process:**
  - Initiates a scanner subscription via the IB API using the loaded criteria.
  - Requests a list of stocks that satisfy the specified metrics.
  - Caches the retrieved list temporarily for further processing.
#### b. Refiner

- **Purpose:** Further refine the scanned stock list based on additional criteria.
- **Criteria:**
  1. **Share Price:** Exclude stocks whose share price exceeds the threshold defined in the config.
  2. **Option Contracts:** Remove stocks without available option contracts.
  3. **Volatility Index:** Exclude stocks with a volatility index above the configured threshold.
  4. **Conditional Truncation:** If enabled, truncate the list to the maximum size specified in the config.
- **Behavior:**
  - Applies each refinement step sequentially.
  - Prints colored console output indicating the inclusion or exclusion of each stock.
  - Caches the refined list for hand-off to subsequent modules.
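The four refinement steps can be sketched as a sequential filter pipeline. The dict keys and default thresholds mirror the `[Module2]` config values shown later, but the function itself is illustrative, not MidasV1's code:

```python
def refine(stocks, max_price=15.0, max_volatility=30.0,
           truncate_enabled=True, max_size=100):
    """Apply the four refinement steps in order.

    Each stock is a dict with assumed keys:
    'symbol', 'price', 'has_options', 'volatility'.
    """
    refined = [s for s in stocks if s["price"] <= max_price]        # 1. share price
    refined = [s for s in refined if s["has_options"]]              # 2. option contracts
    refined = [s for s in refined if s["volatility"] <= max_volatility]  # 3. volatility
    if truncate_enabled:                                            # 4. truncation
        refined = refined[:max_size]
    return refined

sample = [
    {"symbol": "AAA", "price": 12.0, "has_options": True,  "volatility": 25.0},
    {"symbol": "BBB", "price": 99.0, "has_options": True,  "volatility": 10.0},
    {"symbol": "CCC", "price": 10.0, "has_options": False, "volatility": 5.0},
]
print([s["symbol"] for s in refine(sample)])  # only AAA survives all four steps
```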
### Module 3: Stock Information Retrieval

#### a. Load

- **Purpose:** Load the refined stock list from Module 2.
- **Behavior:** Ensures the refined list is available and prepares it for data retrieval.
#### b. Threaded Information Gathering & Choosing Strategy

- **Purpose:** Gather real-time market data for each stock and determine the optimal trading strategy.
- **Process:**
  - **Threading:** Spawns a thread per stock to fetch data asynchronously.
  - **Data Retrieved:** Datetime, high, low, close, and volume at the configured trading interval.
  - **Data Storage:** Saves data in JSON files named `{stock_name}.{current_date}.json` within the `data/` directory.
  - **Strategy Counter:** Maintains a counter over the incoming data to derive trend indicators.
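The threading and file-naming scheme above can be sketched with a thread pool and a pluggable fetch callable. The real module streams bars from the IB API; here `fetch` is any stand-in, and the helper names are illustrative:

```python
import json
import tempfile
from concurrent.futures import ThreadPoolExecutor
from datetime import date
from pathlib import Path

def gather(symbols, fetch, data_dir="data"):
    """Fetch bars per symbol in parallel; write data/{stock_name}.{current_date}.json."""
    out = Path(data_dir)
    out.mkdir(parents=True, exist_ok=True)

    def worker(symbol):
        bars = fetch(symbol)  # any callable returning a list of bar dicts
        path = out / ("%s.%s.json" % (symbol, date.today().isoformat()))
        path.write_text(json.dumps(bars))
        return path

    with ThreadPoolExecutor(max_workers=max(len(symbols), 1)) as pool:
        return list(pool.map(worker, symbols))

def fake_fetch(symbol):
    """Stand-in for the real IB data request."""
    return [{"datetime": "2024-12-13T09:30:00", "high": 2.0, "low": 1.0,
             "close": 1.5, "volume": 1000}]

paths = gather(["AAA", "BBB"], fake_fetch, data_dir=tempfile.mkdtemp())
print([p.name for p in paths])
```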
#### c. Strategy Implementation & Market Determination

- **Purpose:** Analyze the collected data to identify bullish or bearish trends.
- **Indicators:**
  - **RSI (Relative Strength Index)**
  - **MACD (Moving Average Convergence Divergence)**
  - **ADX (Average Directional Index)**
  - **EMA (Exponential Moving Average)**
- **Behavior:**
  - Calculates or retrieves indicator values.
  - Assigns a weight to each indicator based on predefined thresholds.
  - Determines the overall market sentiment (bullish/bearish) for each stock.
  - Based on internal boolean flags, either processes the entire list or isolates the most bullish and bearish stocks for further action.
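Two of the listed indicators are simple enough to sketch directly. These are textbook formulas, not MidasV1's implementation; note the RSI variant below uses plain averages rather than Wilder smoothing:

```python
def ema(values, period):
    """Exponential moving average with the standard smoothing factor 2/(period+1)."""
    k = 2 / (period + 1)
    out = values[0]
    for v in values[1:]:
        out = v * k + out * (1 - k)
    return out

def rsi(closes, period=14):
    """Simple (non-Wilder-smoothed) RSI over the last `period` price changes."""
    deltas = [b - a for a, b in zip(closes, closes[1:])][-period:]
    gains = sum(d for d in deltas if d > 0)
    losses = -sum(d for d in deltas if d < 0)
    if losses == 0:
        return 100.0  # all gains: maximally overbought
    rs = gains / losses
    return 100 - 100 / (1 + rs)

closes = [10.0, 10.5, 10.2, 10.8, 11.0]
print(ema(closes, period=3), rsi(closes))
```

Per-indicator weights would then be applied to such values and summed into the bullish/bearish sentiment score.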
### Module 4: Option Chain Trading & Risk Management

#### a. Option Chain Data

- **Purpose:** Retrieve and analyze option chain data for the selected stocks.
- **Process:**
  - **Data Retrieval:** Fetches the option contracts closest to the current share price.
  - **Filtering:**
    - **Bearish Stocks:** Isolates contracts with strike prices slightly above the share price.
    - **Bullish Stocks:** Isolates contracts with strike prices slightly below the share price.
- **Behavior:** Selects contracts based on proximity to the current market price and other configurable parameters.
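The strike-selection rule above reduces to picking the nearest strike on the side implied by sentiment. A minimal sketch (function name and sentiment labels are assumptions):

```python
def nearest_strike(strikes, price, sentiment):
    """Pick the strike closest to `price` on the side implied by `sentiment`.

    Bearish -> nearest strike above the share price;
    bullish -> nearest strike below it. Returns None if no strike qualifies.
    """
    if sentiment == "bearish":
        return min((s for s in strikes if s > price), default=None)
    return max((s for s in strikes if s < price), default=None)

strikes = [10.0, 12.5, 15.0, 17.5, 20.0]
print(nearest_strike(strikes, 14.2, "bearish"))  # 15.0
print(nearest_strike(strikes, 14.2, "bullish"))  # 12.5
```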
#### b. Risk Management Stage 1

- **Purpose:** Assess whether the risk is acceptable before executing a trade.
- **Process:**
  - Retrieves the user's account balance information.
  - Determines whether the cost of an option contract is within the acceptable risk percentage defined in the config.
- **Outcome:** Proceeds only with contracts that meet the risk criteria.
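The stage-1 gate is a single comparison of contract cost against a configured fraction of the balance. A sketch (the risk-percentage parameter name is an assumption, since the config key is not shown in this README):

```python
def within_risk(contract_cost, account_balance, max_risk_pct):
    """True if the contract's total cost stays within max_risk_pct of the balance."""
    return contract_cost <= account_balance * (max_risk_pct / 100.0)

# One US equity option contract controls 100 shares, so cost = premium * 100.
premium, balance = 2.50, 10_000.0
cost = premium * 100
print(within_risk(cost, balance, max_risk_pct=5.0))  # 250 <= 500 -> True
```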
#### c. Buying and Selling / Risk Management Stage 2

- **Purpose:** Execute trades and manage ongoing risk.
- **Process:**
  - **Trade Execution:** Buys option contracts that passed the risk assessment.
  - **Stop-Loss Orders:** Sets up stop-loss orders based on configurable loss thresholds.
  - **Continuous Monitoring:** Gathers real-time data to drive the selling strategy and exit trades at the right time.
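The stop-loss piece of stage 2 can be sketched as a stop price derived from the entry price and the configured loss threshold, plus a trigger check used during monitoring. Names and the rounding convention are illustrative:

```python
def stop_loss_price(entry_price, max_loss_pct):
    """Price at which a stop order should trigger for a long position."""
    return round(entry_price * (1 - max_loss_pct / 100.0), 2)

def stop_triggered(current_price, stop_price):
    """Stage-2 monitoring: exit when the market trades at or below the stop."""
    return current_price <= stop_price

print(stop_loss_price(2.50, 20.0))  # 2.0
```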
### General Additions

- **Command-Line Flags:**
  - `--no-checks`: Runs the program without prompting for user confirmation after the initial checks.
  - `--skip-checks`: Skips specific initial checks (primarily dependency checks).
  - `--verbose`: Enables verbose, colorized console output.
  - `--version`: Prints the program version and exits.
- **Logging & Console Outputs:**
  - Implements both logging to files and colored console output.
  - Verbosity is controlled via the `--verbose` flag.
- **Graceful Shutdowns:**
  - Handles interrupt signals (e.g., Ctrl+C) to ensure connections are closed properly.
- **Extensibility:**
  - Designed to determine the number of threads from available system resources for optimal performance.
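The four flags map directly onto an `argparse` interface. A sketch of what such a parser could look like (not necessarily how `main.py` builds it):

```python
import argparse

def build_parser(version="1.0.0"):
    """Command-line interface matching the flags described above."""
    parser = argparse.ArgumentParser(prog="MidasV1")
    parser.add_argument("--no-checks", action="store_true",
                        help="run without prompting for confirmation after initial checks")
    parser.add_argument("--skip-checks", action="store_true",
                        help="skip specific initial checks (mainly dependency checks)")
    parser.add_argument("--verbose", action="store_true",
                        help="enable verbose, colorized output")
    parser.add_argument("--version", action="version", version=version)
    return parser

args = build_parser().parse_args(["--verbose"])
print(args.verbose, args.no_checks)  # True False
```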
---

## File Structure

```
MidasV1/
├── README.md
├── requirements.txt
├── config/
│   └── config.config
├── main.py
├── modules/
│   ├── initial_checks.py
│   └── stock_list_petitioner.py
├── tests/
│   ├── test_data_retriever.py
│   ├── test_stock_retriever.py
│   └── test_connection.py
├── logs/
│   └── MidasV1.log
└── data/
    └── {stock_name}.{current_date}.json
```

- **README.md:** This documentation file.
- **requirements.txt:** Lists all Python dependencies required by MidasV1.
- **config/config.config:** Configuration file containing all parameters and thresholds.
- **main.py:** The primary script that orchestrates the application's flow.
- **modules/:** Modular components of MidasV1.
  - **initial_checks.py:** Performs system and environment checks.
  - **stock_list_petitioner.py:** Retrieves and refines stock lists based on criteria.
- **tests/:** Test scripts for the individual modules.
  - **test_data_retriever.py:** Tests data retrieval functionality.
  - **test_stock_retriever.py:** Tests stock retrieval and filtering.
  - **test_connection.py:** Tests connectivity with the IB Gateway.
- **logs/MidasV1.log:** Detailed execution log.
- **data/:** JSON files with raw market data for each stock.

---
## Installation

### Prerequisites

- **Python 3.6 or higher:** Ensure Python is installed on your system. Verify with:

  ```bash
  python --version
  ```

- **Interactive Brokers (IB) Account:** Required to access the IB Gateway or the IBJTS API.

### Steps

1. **Clone the repository:**

   ```bash
   git clone https://github.com/yourusername/MidasV1.git
   cd MidasV1
   ```

2. **Set up a virtual environment (recommended):**

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   ```

3. **Install dependencies:**

   ```bash
   pip install -r requirements.txt
   ```

4. **Download the IB API:**

   - **From Interactive Brokers:**
     - Visit the [Interactive Brokers API](https://www.interactivebrokers.com/en/index.php?f=5041) download page.
     - Download and install the latest version of the IB API.
   - **IB Gateway:**
     - Alternatively, install [IB Gateway](https://www.interactivebrokers.com/en/index.php?f=16457) for a lightweight connection.

5. **Configure IB Gateway:**

   - Launch IB Gateway and log in with your IB credentials.
   - Ensure the API settings allow connections from your machine:
     - **Enable API:** Navigate to `Configure` > `Settings` > `API` > `Settings`.
     - **Trusted IPs:** Add `127.0.0.1` or your specific IP address.
     - **Port:** Ensure it matches the `port` specified in `config/config.config` (default `4002`).

6. **Verify configuration:**

   - Ensure `config/config.config` is set with your desired parameters.
   - An example configuration is provided below.

---
## Configuration

All configurable parameters are stored in `config/config.config`. Below is an example of the configuration file:

```ini
[General]
version = 1.0.0
# Future general configurations can be added here

[Connectivity]
host = 127.0.0.1
port = 4002
client_id = 0
# Add more connectivity parameters as needed

[SystemResources]
# Placeholder for system resource related configurations
# Example:
# max_cpu_threads = 8
# min_available_ram_gb = 4

[Logging]
level = INFO
# Available levels: DEBUG, INFO, WARNING, ERROR, CRITICAL
# Set to DEBUG for verbose logging, INFO for standard logging, etc.

[Module2]
default_search_volume = 1000000
default_net_change = 0.50
default_percent_change = 2.0
default_refinement_share_price = 15.0
default_volatility_threshold = 30.0
conditional_refinement_enabled = True
max_refined_list_size = 100
```

### Configuration Sections

- **[General]:** General settings, including the MidasV1 version.
- **[Connectivity]:**
  - **host:** IP address of the IB Gateway (default `127.0.0.1`).
  - **port:** Port number for the API connection (default `4002` for IB Gateway Simulated Trading).
  - **client_id:** Unique client ID for the API connection.
- **[SystemResources]:** Placeholder for future system-resource configurations; parameters for thread management can be added here.
- **[Logging]:**
  - **level:** Logging verbosity (`DEBUG`, `INFO`, `WARNING`, `ERROR`, `CRITICAL`).
- **[Module2]:**
  - **default_search_volume:** Minimum trading volume for stock selection.
  - **default_net_change:** Minimum net change in stock price.
  - **default_percent_change:** Minimum percentage change in stock price.
  - **default_refinement_share_price:** Maximum share price threshold.
  - **default_volatility_threshold:** Maximum acceptable volatility index.
  - **conditional_refinement_enabled:** Boolean enabling or disabling list truncation.
  - **max_refined_list_size:** Maximum number of stocks in the refined list.
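Reading the `[Connectivity]` section with the documented fallbacks is a one-liner per key with `configparser`. A sketch (the helper name is illustrative):

```python
import configparser

def load_connectivity(path="config/config.config"):
    """Read [Connectivity] with the documented fallbacks (127.0.0.1, 4002, 0)."""
    cfg = configparser.ConfigParser()
    cfg.read(path)  # a missing file silently yields the fallbacks below
    return (
        cfg.get("Connectivity", "host", fallback="127.0.0.1"),
        cfg.getint("Connectivity", "port", fallback=4002),
        cfg.getint("Connectivity", "client_id", fallback=0),
    )

host, port, client_id = load_connectivity()
print(host, port, client_id)
```

`cfg.getboolean("Module2", "conditional_refinement_enabled", fallback=True)` would handle the `True`/`False` string in `[Module2]` the same way.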
---

## Usage

### Running the Application

Navigate to the project root directory and execute `main.py` with the desired flags:

```bash
python main.py [--no-checks] [--skip-checks] [--verbose] [--version]
```

### Available Flags

- `--no-checks`: Run the program without prompting for user confirmation after the initial checks.
- `--skip-checks`: Skip specific initial checks (primarily dependency checks).
- `--verbose`: Enable verbose, colorized console output.
- `--version`: Print the program version and exit.

### Example Commands

1. **Standard execution:**

   ```bash
   python main.py
   ```

2. **Verbose mode:**

   ```bash
   python main.py --verbose
   ```

3. **Skip dependency checks:**

   ```bash
   python main.py --skip-checks
   ```

4. **Run without user confirmation:**

   ```bash
   python main.py --no-checks
   ```

5. **Display version:**

   ```bash
   python main.py --version
   ```

### Testing Modules

MidasV1 includes several test scripts in the `tests/` directory that verify the functionality of individual modules.

1. **Test stock retriever:**

   ```bash
   python tests/test_stock_retriever.py
   ```

2. **Test connection:**

   ```bash
   python tests/test_connection.py
   ```

3. **Test data retrieval:**

   ```bash
   python tests/test_data_retriever.py
   ```

*Note: Ensure that the IB Gateway or IBJTS API is running before executing the test scripts.*
## Logging

MidasV1 uses both file-based logging and console output to track its operations.

- **Log File:** `logs/MidasV1.log`
  - **Location:** The `logs/` directory within the project root.
  - **Content:** Detailed logs including debug information, errors, and informational messages.
  - **Configuration:** Controlled via the `[Logging]` section in `config/config.config`.
- **Console Outputs:**
  - **Color-Coded Messages:** Green for successes, red for errors, yellow for warnings, and blue/magenta for informational and decorative messages.
  - **Verbosity:** Managed via the `--verbose` flag and the logging level set in the configuration file.

*Ensure that the `logs/` directory exists (or is created) before running the application to prevent logging errors.*

---
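A logging setup consistent with the log format visible in `logs/MidasV1.log` (timestamp - logger name - level - message) might look like this. The function name is an assumption; note it creates `logs/` up front, addressing the caveat above:

```python
import logging
from pathlib import Path

def setup_logging(log_dir="logs", level="INFO"):
    """Create the logs/ directory and attach file + console handlers."""
    Path(log_dir).mkdir(parents=True, exist_ok=True)  # avoids the logging error noted above
    logger = logging.getLogger("MidasV1")
    logger.setLevel(getattr(logging, level.upper(), logging.INFO))
    fmt = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
    file_handler = logging.FileHandler(Path(log_dir) / "MidasV1.log")
    file_handler.setFormatter(fmt)
    console = logging.StreamHandler()
    console.setFormatter(fmt)
    logger.addHandler(file_handler)
    logger.addHandler(console)
    return logger
```

Child loggers such as `MidasV1.InitialChecks` and `MidasV1.Main` (as seen in the log file) then inherit these handlers via `logging.getLogger("MidasV1.InitialChecks")`.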
## Future Enhancements

MidasV1 is designed with scalability in mind, allowing for future feature additions and optimizations.

1. **Operating System Support:**
   - Extend support to Windows, macOS, BSD, illumos, etc., with OS-specific handling.
2. **Advanced Dependency Management:**
   - Implement dynamic dependency resolution and version management.
3. **Enhanced Strategy Module:**
   - Develop more sophisticated trading strategies based on additional market indicators.
   - Incorporate machine learning algorithms for predictive analysis.
4. **Risk Management Enhancements:**
   - Implement multi-stage risk assessments.
   - Integrate portfolio diversification strategies.
5. **Performance Optimization:**
   - Use system resource checks to dynamically allocate threads for optimal performance.
   - Implement rate limiting and efficient data handling.
6. **User Interface:**
   - Develop a graphical user interface (GUI) for easier interaction and monitoring.
   - Provide real-time dashboards for tracking trades and system status.
7. **Extensive Testing:**
   - Expand test coverage to include integration and stress tests.
   - Implement continuous integration (CI) pipelines for automated testing.
8. **Documentation & Support:**
   - Expand the documentation with tutorials and usage guides.
   - Provide support channels for troubleshooting and user assistance.

---
## Disclaimer

**MidasV1** is proprietary software developed for private use. Unauthorized distribution, replication, or modification is strictly prohibited. The author assumes no responsibility for any misuse of, or damages resulting from, the use of this software. Users are advised to test the application thoroughly in a controlled environment (e.g., paper trading) before deploying it in live trading.

---

## Additional Information

- **Contact:** For inquiries or support, please contact [kleinpanic@gmail.com](mailto:kleinpanic@gmail.com).
- **License:** All rights reserved. No part of this software may be reproduced or transmitted in any form or by any means without the prior written permission of the author.

---
src/MidasV1/config/config.config — new file, 32 lines

```ini
# config/config.config

[General]
version = 1.0.0
# Future general configurations can be added here

[Connectivity]
host = 127.0.0.1
port = 4002
client_id = 0
# Add more connectivity parameters as needed

[SystemResources]
# Placeholder for system resource related configurations
# Example:
# max_cpu_threads = 8
# min_available_ram_gb = 4

[Logging]
level = INFO
# Available levels: DEBUG, INFO, WARNING, ERROR, CRITICAL
# Set to DEBUG for verbose logging, INFO for standard logging, etc.

[Module2]
default_search_volume = 1000000
default_net_change = 0.50
default_percent_change = 2.0
default_refinement_share_price = 15.0
default_volatility_threshold = 30.0
conditional_refinement_enabled = True
max_refined_list_size = 100
```
src/MidasV1/logs/MidasV1.log — new file, 164 lines

```
2024-12-13 01:25:45,907 - MidasV1.Main - INFO - Starting MidasV1 Trading Bot...
2024-12-13 01:25:45,907 - MidasV1.InitialChecks - INFO - Checking Operating System...
2024-12-13 01:25:45,907 - MidasV1.InitialChecks - INFO - Operating System Linux is supported.
2024-12-13 01:25:45,907 - MidasV1.InitialChecks - INFO - Checking Dependencies...
2024-12-13 01:25:45,907 - MidasV1.InitialChecks - INFO - All dependencies are satisfied.
2024-12-13 01:25:45,907 - MidasV1.InitialChecks - INFO - Checking System Resources...
2024-12-13 01:25:45,908 - MidasV1.InitialChecks - INFO -
CPU Cores: 12
CPU Clock Speed: 891.19 MHz
CPU Load Average (1m, 5m, 15m): (0.333984375, 0.28125, 0.21630859375)
CPU Threads: 12
Total RAM: 15.31 GB
Used RAM: 3.15 GB (27.7%)
Available RAM: 11.08 GB
Total Swap: 0.95 GB
Used Swap: 0.00 GB (0.0%)
2024-12-13 01:25:45,908 - MidasV1.InitialChecks - INFO - Checking Connectivity with IB Gateway...
2024-12-13 01:25:45,908 - MidasV1.InitialChecks - INFO - ────────────────────────────────────────────────────
2024-12-13 01:25:45,908 - MidasV1.InitialChecks - INFO - IB Gateway Connection Test
2024-12-13 01:25:45,908 - MidasV1.InitialChecks - INFO - ────────────────────────────────────────────────────
2024-12-13 01:25:45,908 - MidasV1.InitialChecks - INFO - Attempting to connect to IB Gateway at 127.0.0.1:4002...
2024-12-13 01:25:45,912 - MidasV1.InitialChecks - INFO - [INFO] Managed accounts: DUE064818
2024-12-13 01:25:45,912 - MidasV1.InitialChecks - INFO - [INFO] Next valid order ID: 1
2024-12-13 01:25:45,912 - MidasV1.InitialChecks - INFO - [INFO/STATUS] id=-1, code=2104, msg=Market data farm connection is OK:usfarm
2024-12-13 01:25:45,955 - MidasV1.InitialChecks - INFO - [INFO/STATUS] id=-1, code=2107, msg=HMDS data farm connection is inactive but should be available upon demand.ushmds
2024-12-13 01:25:45,955 - MidasV1.InitialChecks - INFO - [INFO/STATUS] id=-1, code=2158, msg=Sec-def data farm connection is OK:secdefnj
2024-12-13 01:25:46,011 - MidasV1.InitialChecks - INFO - [INFO] Connected successfully!
2024-12-13 01:25:46,011 - MidasV1.InitialChecks - INFO - Requesting account summary...
2024-12-13 01:25:46,041 - MidasV1.InitialChecks - INFO - [ACCOUNT SUMMARY] ReqId:1, Account:DUE064818, BuyingPower = 4014612.28 USD
2024-12-13 01:25:46,042 - MidasV1.InitialChecks - INFO - [ACCOUNT SUMMARY] ReqId:1, Account:DUE064818, EquityWithLoanValue = 1003653.07 USD
2024-12-13 01:25:46,042 - MidasV1.InitialChecks - INFO - [ACCOUNT SUMMARY] ReqId:1, Account:DUE064818, NetLiquidation = 1004890.64 USD
2024-12-13 01:25:46,043 - MidasV1.InitialChecks - INFO - [ACCOUNT SUMMARY] ReqId:1, Account:DUE064818, TotalCashValue = 1003653.07 USD
2024-12-13 01:25:46,043 - MidasV1.InitialChecks - INFO - [ACCOUNT SUMMARY END] ReqId: 1
2024-12-13 01:25:51,012 - MidasV1.InitialChecks - INFO - [INFO] Successfully retrieved account summary data.
2024-12-13 01:25:51,012 - MidasV1.InitialChecks - INFO - IB Gateway is connected and ready for upcoming modules.
2024-12-13 01:25:51,013 - MidasV1.InitialChecks - INFO - ────────────────────────────────────────────────────
2024-12-13 01:25:51,013 - MidasV1.InitialChecks - INFO - Test Complete
2024-12-13 01:25:51,013 - MidasV1.InitialChecks - INFO - ────────────────────────────────────────────────────
2024-12-13 01:25:51,013 - MidasV1.InitialChecks - INFO - All initial checks passed successfully.
2024-12-13 01:25:51,013 - MidasV1.StockListPetitioner - INFO - Starting Module 2: IBJTS List Petitioner...
2024-12-13 01:25:51,013 - MidasV1.StockListPetitioner - INFO - Loaded Scanner Criteria:
- Search Volume: 1000000
- Net Change: 0.5
- Percent Change: 2.0
2024-12-13 01:25:51,013 - MidasV1.StockListPetitioner - INFO - Initiating scanner subscription with the above criteria...
2024-12-13 01:25:51,014 - MidasV1.StockListPetitioner - INFO - Scanner subscription requested successfully on attempt 1.
2024-12-13 01:25:51,014 - MidasV1.StockListPetitioner - INFO - Waiting for scanner data (timeout in 15 seconds)...
2024-12-13 01:25:51,115 - MidasV1.InitialChecks - ERROR - [ERROR] id=1001, code=162, msg=Historical Market Data Service error message:Scanner type with code ALL is disabled
2024-12-13 01:26:06,014 - MidasV1.StockListPetitioner - ERROR - Scanner subscription timed out.
2024-12-13 01:26:06,015 - MidasV1.StockListPetitioner - ERROR - Scanner subscription canceled due to timeout.
2024-12-13 01:26:06,015 - MidasV1.Main - INFO - Refined Stock List: []
2024-12-13 01:26:06,015 - MidasV1.Main - INFO - Initial checks and Module 2 completed. Please review the logs and ensure everything is correct.
2024-12-13 01:26:06,017 - MidasV1.InitialChecks - ERROR - [ERROR] id=1001, code=365, msg=No scanner subscription found for ticker id:1001
2024-12-13 01:26:08,651 - MidasV1.Main - ERROR - Interrupt received. Shutting down gracefully...
2024-12-13 01:26:08,651 - MidasV1.InitialChecks - ERROR - [ERROR] Connection to IB Gateway was closed unexpectedly!
2024-12-13 01:26:08,651 - MidasV1.Main - INFO - Disconnected from IB Gateway.
2024-12-13 01:40:53,290 - MidasV1.Main - INFO - Starting MidasV1 Trading Bot...
2024-12-13 01:40:53,290 - MidasV1.InitialChecks - INFO - Checking Operating System...
2024-12-13 01:40:53,290 - MidasV1.InitialChecks - INFO - Operating System Linux is supported.
2024-12-13 01:40:53,290 - MidasV1.InitialChecks - INFO - Checking Dependencies...
2024-12-13 01:40:53,290 - MidasV1.InitialChecks - INFO - All dependencies are satisfied.
2024-12-13 01:40:53,290 - MidasV1.InitialChecks - INFO - Checking System Resources...
2024-12-13 01:40:53,291 - MidasV1.InitialChecks - INFO -
CPU Cores: 12
CPU Clock Speed: 1745.26 MHz
CPU Load Average (1m, 5m, 15m): (0.775390625, 0.732421875, 0.5390625)
CPU Threads: 12
Total RAM: 15.31 GB
Used RAM: 2.93 GB (26.5%)
Available RAM: 11.26 GB
Total Swap: 0.95 GB
Used Swap: 0.00 GB (0.0%)
2024-12-13 01:40:53,291 - MidasV1.InitialChecks - INFO - Checking Connectivity with IB Gateway...
2024-12-13 01:40:53,291 - MidasV1.InitialChecks - INFO - ────────────────────────────────────────────────────
2024-12-13 01:40:53,291 - MidasV1.InitialChecks - INFO - IB Gateway Connection Test
2024-12-13 01:40:53,291 - MidasV1.InitialChecks - INFO - ────────────────────────────────────────────────────
2024-12-13 01:40:53,291 - MidasV1.InitialChecks - INFO - Attempting to connect to IB Gateway at 127.0.0.1:4002...
2024-12-13 01:40:53,335 - MidasV1.InitialChecks - INFO - [INFO] Managed accounts: DUE064818
2024-12-13 01:40:53,379 - MidasV1.InitialChecks - INFO - [INFO] Next valid order ID: 1
2024-12-13 01:40:53,379 - MidasV1.InitialChecks - INFO - [INFO/STATUS] id=-1, code=2104, msg=Market data farm connection is OK:usfarm
2024-12-13 01:40:53,379 - MidasV1.InitialChecks - INFO - [INFO/STATUS] id=-1, code=2106, msg=HMDS data farm connection is OK:ushmds
2024-12-13 01:40:53,379 - MidasV1.InitialChecks - INFO - [INFO/STATUS] id=-1, code=2158, msg=Sec-def data farm connection is OK:secdefnj
2024-12-13 01:40:53,429 - MidasV1.InitialChecks - INFO - [INFO] Connected successfully!
2024-12-13 01:40:53,430 - MidasV1.InitialChecks - INFO - Requesting account summary...
2024-12-13 01:40:53,591 - MidasV1.InitialChecks - INFO - [ACCOUNT SUMMARY] ReqId:1, Account:DUE064818, BuyingPower = 4014612.28 USD
2024-12-13 01:40:53,592 - MidasV1.InitialChecks - INFO - [ACCOUNT SUMMARY] ReqId:1, Account:DUE064818, EquityWithLoanValue = 1003653.07 USD
2024-12-13 01:40:53,592 - MidasV1.InitialChecks - INFO - [ACCOUNT SUMMARY] ReqId:1, Account:DUE064818, NetLiquidation = 1004890.64 USD
2024-12-13 01:40:53,593 - MidasV1.InitialChecks - INFO - [ACCOUNT SUMMARY] ReqId:1, Account:DUE064818, TotalCashValue = 1003653.07 USD
2024-12-13 01:40:53,594 - MidasV1.InitialChecks - INFO - [ACCOUNT SUMMARY END] ReqId: 1
2024-12-13 01:40:58,431 - MidasV1.InitialChecks - INFO - [INFO] Successfully retrieved account summary data.
2024-12-13 01:40:58,432 - MidasV1.InitialChecks - INFO - IB Gateway is connected and ready for upcoming modules.
2024-12-13 01:40:58,432 - MidasV1.InitialChecks - INFO - ────────────────────────────────────────────────────
2024-12-13 01:40:58,432 - MidasV1.InitialChecks - INFO - Test Complete
2024-12-13 01:40:58,433 - MidasV1.InitialChecks - INFO - ────────────────────────────────────────────────────
2024-12-13 01:40:58,433 - MidasV1.InitialChecks - INFO - All initial checks passed successfully.
```
|
||||||
|
2024-12-13 01:40:58,433 - MidasV1.StockListPetitioner - INFO - Starting Module 2: IBJTS List Petitioner...
|
||||||
|
2024-12-13 01:40:58,433 - MidasV1.StockListPetitioner - INFO - Loaded Scanner Criteria:
|
||||||
|
- Search Volume: 1000000
|
||||||
|
- Net Change: 0.5
|
||||||
|
- Percent Change: 2.0
|
||||||
|
2024-12-13 01:40:58,434 - MidasV1.StockListPetitioner - INFO - Initiating scanner subscription with the above criteria...
|
||||||
|
2024-12-13 01:40:58,434 - MidasV1.StockListPetitioner - INFO - Scanner subscription requested successfully on attempt 1.
|
||||||
|
2024-12-13 01:40:58,435 - MidasV1.StockListPetitioner - INFO - Waiting for scanner data (timeout in 15 seconds)...
|
||||||
|
2024-12-13 01:40:59,213 - MidasV1.InitialChecks - ERROR - [ERROR] id=1001, code=162, msg=Historical Market Data Service error message:Scanner type with code ALL is disabled
|
||||||
|
2024-12-13 01:41:13,435 - MidasV1.StockListPetitioner - ERROR - Scanner subscription timed out.
|
||||||
|
2024-12-13 01:41:13,435 - MidasV1.StockListPetitioner - ERROR - Scanner subscription canceled due to timeout.
|
||||||
|
2024-12-13 01:41:13,436 - MidasV1.Main - INFO - Refined Stock List: []
|
||||||
|
2024-12-13 01:41:13,436 - MidasV1.Main - INFO - Initial checks and Module 2 completed. Please review the logs and ensure everything is correct.
|
||||||
|
2024-12-13 01:41:13,439 - MidasV1.InitialChecks - ERROR - [ERROR] id=1001, code=365, msg=No scanner subscription found for ticker id:1001
|
||||||
|
2024-12-13 01:41:45,958 - MidasV1.Main - ERROR - Interrupt received. Shutting down gracefully...
|
||||||
|
2024-12-13 01:41:45,958 - MidasV1.InitialChecks - ERROR - [ERROR] Connection to IB Gateway was closed unexpectedly!
|
||||||
|
2024-12-13 01:41:45,958 - MidasV1.Main - INFO - Disconnected from IB Gateway.
|
||||||
|
2024-12-13 02:13:25,804 - MidasV1.Main - INFO - Starting MidasV1 Trading Bot...
|
||||||
|
2024-12-13 02:13:25,804 - MidasV1.InitialChecks - INFO - Checking Operating System...
|
||||||
|
2024-12-13 02:13:25,804 - MidasV1.InitialChecks - INFO - Operating System Linux is supported.
|
||||||
|
2024-12-13 02:13:25,804 - MidasV1.InitialChecks - INFO - Checking Dependencies...
|
||||||
|
2024-12-13 02:13:25,804 - MidasV1.InitialChecks - INFO - All dependencies are satisfied.
|
||||||
|
2024-12-13 02:13:25,804 - MidasV1.InitialChecks - INFO - Checking System Resources...
|
||||||
|
2024-12-13 02:13:25,805 - MidasV1.InitialChecks - INFO -
|
||||||
|
[93mCPU Cores: 12
|
||||||
|
CPU Clock Speed: 1030.11 MHz
|
||||||
|
CPU Load Average (1m, 5m, 15m): (0.7734375, 0.642578125, 0.6494140625)
|
||||||
|
CPU Threads: 12
|
||||||
|
Total RAM: 15.31 GB
|
||||||
|
Used RAM: 3.11 GB (27.6%)
|
||||||
|
Available RAM: 11.09 GB
|
||||||
|
Total Swap: 0.95 GB
|
||||||
|
Used Swap: 0.00 GB (0.0%)[0m
|
||||||
|
2024-12-13 02:13:25,805 - MidasV1.InitialChecks - INFO - Checking Connectivity with IB Gateway...
|
||||||
|
2024-12-13 02:13:25,805 - MidasV1.InitialChecks - INFO - ────────────────────────────────────────────────────
|
||||||
|
2024-12-13 02:13:25,805 - MidasV1.InitialChecks - INFO - IB Gateway Connection Test
|
||||||
|
2024-12-13 02:13:25,805 - MidasV1.InitialChecks - INFO - ────────────────────────────────────────────────────
|
||||||
|
2024-12-13 02:13:25,805 - MidasV1.InitialChecks - INFO - Attempting to connect to IB Gateway at 127.0.0.1:4002...
|
||||||
|
2024-12-13 02:13:25,809 - MidasV1.InitialChecks - INFO - [INFO] Managed accounts: DUE064818
|
||||||
|
2024-12-13 02:13:25,851 - MidasV1.InitialChecks - INFO - [INFO] Next valid order ID: 1
|
||||||
|
2024-12-13 02:13:25,851 - MidasV1.InitialChecks - INFO - [INFO/STATUS] id=-1, code=2104, msg=Market data farm connection is OK:usfarm
|
||||||
|
2024-12-13 02:13:25,852 - MidasV1.InitialChecks - INFO - [INFO/STATUS] id=-1, code=2107, msg=HMDS data farm connection is inactive but should be available upon demand.ushmds
|
||||||
|
2024-12-13 02:13:25,852 - MidasV1.InitialChecks - INFO - [INFO/STATUS] id=-1, code=2158, msg=Sec-def data farm connection is OK:secdefnj
|
||||||
|
2024-12-13 02:13:25,908 - MidasV1.InitialChecks - INFO - [INFO] Connected successfully!
|
||||||
|
2024-12-13 02:13:25,909 - MidasV1.InitialChecks - INFO - Requesting account summary...
|
||||||
|
2024-12-13 02:13:25,940 - MidasV1.InitialChecks - INFO - [ACCOUNT SUMMARY] ReqId:1, Account:DUE064818, BuyingPower = 4014612.28 USD
|
||||||
|
2024-12-13 02:13:25,940 - MidasV1.InitialChecks - INFO - [ACCOUNT SUMMARY] ReqId:1, Account:DUE064818, EquityWithLoanValue = 1003653.07 USD
|
||||||
|
2024-12-13 02:13:25,941 - MidasV1.InitialChecks - INFO - [ACCOUNT SUMMARY] ReqId:1, Account:DUE064818, NetLiquidation = 1004890.64 USD
|
||||||
|
2024-12-13 02:13:25,941 - MidasV1.InitialChecks - INFO - [ACCOUNT SUMMARY] ReqId:1, Account:DUE064818, TotalCashValue = 1003653.07 USD
|
||||||
|
2024-12-13 02:13:25,941 - MidasV1.InitialChecks - INFO - [ACCOUNT SUMMARY END] ReqId: 1
|
||||||
|
2024-12-13 02:13:30,910 - MidasV1.InitialChecks - INFO - [INFO] Successfully retrieved account summary data.
|
||||||
|
2024-12-13 02:13:30,910 - MidasV1.InitialChecks - INFO - IB Gateway is connected and ready for upcoming modules.
|
||||||
|
2024-12-13 02:13:30,911 - MidasV1.InitialChecks - INFO - ────────────────────────────────────────────────────
|
||||||
|
2024-12-13 02:13:30,911 - MidasV1.InitialChecks - INFO - Test Complete
|
||||||
|
2024-12-13 02:13:30,911 - MidasV1.Main - ERROR - An unexpected error occurred during initial checks or Module 2.
|
||||||
|
Traceback (most recent call last):
|
||||||
|
File "/home/klein/codeWS/Projects/MidasTechnologiesLLC/MidasTechnologies/src/MidasV1/main.py", line 194, in main
|
||||||
|
connected_client = initial_checks.run_all_checks(skip_checks=args.skip_checks, callback_handlers=[stock_petitioner]) # Pass as callback handler
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/home/klein/codeWS/Projects/MidasTechnologiesLLC/MidasTechnologies/src/MidasV1/modules/initial_checks.py", line 357, in run_all_checks
|
||||||
|
connected_client = self.check_connectivity(callback_handlers)
|
||||||
|
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||||
|
File "/home/klein/codeWS/Projects/MidasTechnologiesLLC/MidasTechnologies/src/MidasV1/modules/initial_checks.py", line 285, in check_connectivity
|
||||||
|
app.connect_and_run()
|
||||||
|
File "/home/klein/codeWS/Projects/MidasTechnologiesLLC/MidasTechnologies/src/MidasV1/modules/initial_checks.py", line 280, in connect_and_run
|
||||||
|
print(f"{MAGENTA}{BOLD} Test Complete{RESET}")
|
||||||
|
^^^^^^^
|
||||||
|
NameError: name 'MAGENTA' is not defined
|
||||||
|
2024-12-13 02:13:49,362 - MidasV1.Main - ERROR - Interrupt received. Shutting down gracefully...
|
||||||
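The failed run above aborts because the scanner request uses scan code `ALL`, which the gateway rejects (error 162), while codes 2104/2106/2107/2158 are routine data-farm status notices. A minimal sketch of that severity split, mirroring what the `error()` callback in `initial_checks.py` does (the helper name `classify_ib_code` is hypothetical, not part of MidasV1):

```python
# Hypothetical helper mirroring the info/error split applied to IB Gateway
# message codes in the logs above. Codes 2104/2106/2107/2158 are routine
# data-farm status notices; anything else (e.g. 162 "scanner type disabled",
# 365 "no scanner subscription") is treated as a real error.
INFO_STATUS_CODES = {2104, 2106, 2107, 2158}

def classify_ib_code(code: int) -> str:
    """Return 'info' for routine status codes, 'error' for everything else."""
    return "info" if code in INFO_STATUS_CODES else "error"
```

Routing messages through a check like this keeps benign connection chatter at INFO level so genuine failures such as the disabled `ALL` scan code stand out in the log.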
261
src/MidasV1/main.py
Normal file
@@ -0,0 +1,261 @@
# main.py
"""
========================================================================
# README
#
# Program: main.py
#
# Description:
# This script serves as the entry point for the MidasV1 Trading Bot.
# It performs initial system checks, handles configuration loading, sets up logging,
# initializes modules, and manages the overall flow of the application.
#
# Features:
# - Parses command-line arguments for customization.
# - Loads configuration settings from a config file.
# - Sets up colored logging for enhanced visibility.
# - Handles graceful shutdown on interrupt signals (e.g., Ctrl+C).
# - Initializes and runs initial checks and the Stock List Petitioner module.
# - Provides placeholders for integrating additional modules in the future.
#
# Usage:
# Run the script from the command line with optional arguments:
#   --no-checks   : Run the program without prompting for user confirmation after initial checks.
#   --skip-checks : Skip specific initial checks (primarily dependency checks).
#   --verbose     : Enable verbose and colorful output to the console.
#   --version     : Print the program version and exit.
#
# Example:
#   python main.py --verbose
#   python main.py --version
#   python main.py --verbose --no-checks
#   python main.py --verbose --skip-checks
#
# Coded by: kleinpanic 2024
========================================================================
"""

import argparse
import logging
import sys
import os
import configparser
import signal
import threading

from modules.initial_checks import InitialChecks
from modules.stock_list_petitioner import StockListPetitioner  # Import Module 2


class ColoredFormatter(logging.Formatter):
    # ANSI escape codes for colors
    COLOR_CODES = {
        'DEBUG': "\033[94m",     # Blue
        'INFO': "\033[92m",      # Green
        'WARNING': "\033[93m",   # Yellow
        'ERROR': "\033[91m",     # Red
        'CRITICAL': "\033[95m",  # Magenta
    }
    RESET_CODE = "\033[0m"

    def format(self, record):
        color = self.COLOR_CODES.get(record.levelname, self.RESET_CODE)
        message = super().format(record)
        if record.levelname in self.COLOR_CODES:
            message = f"{color}{message}{self.RESET_CODE}"
        return message


def setup_logging(verbose=False, log_level='INFO'):
    """
    Configures logging for the application.

    Args:
        verbose (bool): If True, set logging level to DEBUG and log to console with colors.
        log_level (str): Specific logging level from config.
    """
    log_format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
    log_file = os.path.join('logs', 'MidasV1.log')

    # Create logs directory if it doesn't exist
    os.makedirs('logs', exist_ok=True)

    # Determine logging level
    level = getattr(logging, log_level.upper(), logging.INFO)
    if verbose:
        level = logging.DEBUG

    logger = logging.getLogger()
    logger.setLevel(level)

    # Remove existing handlers to prevent duplicate logs
    if logger.hasHandlers():
        logger.handlers.clear()

    # File handler (uncolored, all logs)
    file_handler = logging.FileHandler(log_file)
    file_handler.setLevel(logging.DEBUG)  # Log all levels to file
    file_formatter = logging.Formatter(log_format)
    file_handler.setFormatter(file_formatter)
    logger.addHandler(file_handler)

    if verbose:
        # Console handler (colored, INFO and above)
        console_handler = logging.StreamHandler(sys.stdout)
        console_handler.setLevel(logging.INFO)
        console_formatter = ColoredFormatter(log_format)
        console_handler.setFormatter(console_formatter)
        logger.addHandler(console_handler)


def load_config(config_path='config/config.config'):
    """
    Loads the configuration file.

    Args:
        config_path (str): Path to the configuration file.

    Returns:
        configparser.ConfigParser: Parsed configuration.
    """
    config = configparser.ConfigParser()
    if not os.path.exists(config_path):
        print(f"Configuration file not found at {config_path}. Exiting.")
        sys.exit(1)
    config.read(config_path)
    return config


def parse_arguments():
    """
    Parses command-line arguments.

    Returns:
        argparse.Namespace: Parsed arguments.
    """
    parser = argparse.ArgumentParser(description='MidasV1 Trading Bot')
    parser.add_argument('--no-checks', action='store_true', help='Run the program without prompting for user confirmation after initial checks')
    parser.add_argument('--skip-checks', action='store_true', help='Skip specific initial checks (primarily dependency checks)')
    parser.add_argument('--verbose', action='store_true', help='Enable verbose and colorful output to the console')
    parser.add_argument('--version', action='store_true', help='Print the program version and exit')
    return parser.parse_args()


# Initialize connected_client as global
connected_client = None


def signal_handler(sig, frame):
    """
    Handles incoming signals for graceful shutdown.

    Args:
        sig (int): Signal number.
        frame: Current stack frame.
    """
    logger = logging.getLogger('MidasV1.Main')
    logger.error("Interrupt received. Shutting down gracefully...")
    global connected_client
    if connected_client:
        connected_client.disconnect()
        logger.info("Disconnected from IB Gateway.")
    sys.exit(0)


def main():
    global connected_client  # Declare as global to modify the global variable

    # Register the signal handler for graceful shutdown
    signal.signal(signal.SIGINT, signal_handler)
    signal.signal(signal.SIGTERM, signal_handler)

    # Parse command-line arguments
    args = parse_arguments()

    # Load configuration
    config = load_config()

    # Setup logging based on flags and config
    log_level = config.get('Logging', 'level', fallback='INFO')
    setup_logging(verbose=args.verbose, log_level=log_level)
    logger = logging.getLogger('MidasV1.Main')

    # Suppress ibapi internal logs from propagating to root logger
    ibapi_logger = logging.getLogger('ibapi')
    ibapi_logger.setLevel(logging.WARNING)  # Suppress DEBUG and INFO
    ibapi_logger.propagate = False

    # Handle --version flag
    if args.version:
        version = config.get('General', 'version', fallback='1.0.0')
        print(f"MidasV1 Version: {version}")
        sys.exit(0)

    logger.info("Starting MidasV1 Trading Bot...")

    # Initialize and run initial checks if not skipping all checks
    if not args.skip_checks:
        try:
            initial_checks = InitialChecks(config, verbose=args.verbose)
            stock_petitioner = StockListPetitioner(config)  # Instantiate with only config
            connected_client = initial_checks.run_all_checks(skip_checks=args.skip_checks, callback_handlers=[stock_petitioner])  # Pass as callback handler
            stock_petitioner.set_client(connected_client)  # Set the connected client
            refined_stock_list = stock_petitioner.run_module()
            logger.info(f"Refined Stock List: {refined_stock_list}")  # Log to prevent unused warning
        except SystemExit as e:
            logger.error("Initial checks failed. Exiting program.")
            sys.exit(e.code)
        except Exception as e:
            logger.exception("An unexpected error occurred during initial checks or Module 2.")
            if connected_client:
                connected_client.disconnect()
                logger.info("Disconnected from IB Gateway.")
            sys.exit(1)
    else:
        logger.warning("Skipping specific initial checks as per the '--skip-checks' flag.")
        refined_stock_list = []

    # Prompt the user to confirm before proceeding to the next module
    if not args.no_checks:
        logger.info("Initial checks and Module 2 completed. Please review the logs and ensure everything is correct.")
        try:
            while True:
                user_input = input("Do you want to proceed to the next module? (y/n): ").strip().lower()
                if user_input == 'y':
                    logger.info("User chose to proceed.")
                    break
                elif user_input == 'n':
                    logger.info("User chose to exit the program.")
                    if connected_client:
                        connected_client.disconnect()
                        logger.info("Disconnected from IB Gateway.")
                    sys.exit(0)
                else:
                    print("Please enter 'y' or 'n'.")
        except KeyboardInterrupt:
            logger.error("Interrupt received during user prompt. Shutting down gracefully...")
            if connected_client:
                connected_client.disconnect()
                logger.info("Disconnected from IB Gateway.")
            sys.exit(0)

    else:
        logger.info("Proceeding to the next module without user confirmation as per the '--no-checks' flag.")

    # Placeholder for initializing and running other modules (e.g., Module 3)
    # Example:
    # from modules.module3 import Module3
    # module3 = Module3(config, connected_client, refined_stock_list)
    # module3.run()

    logger.info("MidasV1 Trading Bot is now running.")

    # Placeholder for main loop or orchestration logic
    try:
        while True:
            # Implement the main functionality here
            # For demonstration, we'll just sleep
            threading.Event().wait(1)
    except KeyboardInterrupt:
        logger.error("Interrupt received. Shutting down gracefully...")
        if connected_client:
            connected_client.disconnect()
            logger.info("Disconnected from IB Gateway.")
        sys.exit(0)


if __name__ == '__main__':
    main()
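To see what `main.py`'s `ColoredFormatter` produces without running the whole bot, the wrapping behavior can be sketched standalone (the class and record names below are illustrative, and only the `INFO` color from the mapping above is reproduced):

```python
# Standalone sketch of ColoredFormatter's behavior: wrap the formatted
# record in an ANSI color code keyed off the level name.
import logging

COLOR_CODES = {'INFO': "\033[92m"}  # green, as in main.py's mapping
RESET_CODE = "\033[0m"

class DemoColoredFormatter(logging.Formatter):
    def format(self, record):
        message = super().format(record)
        color = COLOR_CODES.get(record.levelname)
        return f"{color}{message}{RESET_CODE}" if color else message

# Build a LogRecord by hand instead of going through a logger.
record = logging.LogRecord('MidasV1.Demo', logging.INFO, __file__, 1,
                           'Starting MidasV1 Trading Bot...', None, None)
formatted = DemoColoredFormatter('%(name)s - %(levelname)s - %(message)s').format(record)
```

`formatted` is the plain `name - LEVEL - message` string bracketed by the green escape code and the reset code, which is exactly why the file handler uses the plain `logging.Formatter` instead: ANSI codes would otherwise end up in `logs/MidasV1.log`.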
1
src/MidasV1/modules/.gitignore
vendored
Normal file
@@ -0,0 +1 @@
backups/
BIN
src/MidasV1/modules/__pycache__/initial_checks.cpython-311.pyc
Normal file
Binary file not shown.
312
src/MidasV1/modules/initial_checks.py
Normal file
@@ -0,0 +1,312 @@
# modules/initial_checks.py
"""
========================================================================
# README
#
# Module: initial_checks.py
#
# Description:
# This module performs a series of initial system and environment checks
# required before the MidasV1 Trading Bot can operate effectively.
# It verifies the operating system, checks for necessary dependencies,
# assesses system resources, and ensures connectivity with the Interactive Brokers (IB) Gateway.
#
# Features:
# - Checks if the operating system is supported (currently Linux).
# - Verifies that required Python packages (`ibapi`, `psutil`) are installed.
# - Logs detailed system resource information including CPU cores, clock speed,
#   load averages, CPU threads, and RAM statistics.
# - Tests connectivity with the IB Gateway by attempting to establish a session
#   and retrieve account summaries.
# - Integrates with other modules via callback handlers to facilitate data exchange.
#
# Usage:
# This module is primarily used by `main.py` during the startup phase.
#
# Example:
#   from modules.initial_checks import InitialChecks
#   config = load_config()
#   initial_checks = InitialChecks(config, verbose=True)
#   connected_client = initial_checks.run_all_checks(callback_handlers=[stock_petitioner])
#
# Coded by: kleinpanic 2024
========================================================================
"""

import platform
import sys
import logging
import os
import psutil
import threading
import time

from ibapi.client import EClient
from ibapi.wrapper import EWrapper
from ibapi.utils import iswrapper

from ibapi.contract import Contract
from ibapi.account_summary_tags import AccountSummaryTags  # Ensure correct import

from ibapi.tag_value import TagValue

SEPARATOR = "────────────────────────────────────────────────────"


class InitialChecks:
    def __init__(self, config, verbose=False):
        self.config = config
        self.verbose = verbose
        self.logger = logging.getLogger('MidasV1.InitialChecks')

    def check_os(self):
        """
        Determines the operating system and verifies if it's supported.
        Currently supports only Linux.
        """
        self.logger.info("Checking Operating System...")
        os_type = platform.system()
        if os_type != 'Linux':
            message = f"Unsupported Operating System: {os_type}"
            self.logger.error(message)
            self.logger.warning("Future support for other operating systems is being added.")
            sys.exit(1)
        success_message = f"Operating System {os_type} is supported."
        self.logger.info(success_message)

    def check_dependencies(self):
        """
        Ensures that all necessary dependencies are installed.
        """
        self.logger.info("Checking Dependencies...")
        # Check if 'ibapi' and 'psutil' are installed
        dependencies = ['ibapi', 'psutil']
        missing = []
        for dep in dependencies:
            try:
                __import__(dep)
            except ImportError:
                missing.append(dep)
        if missing:
            error_message = f"Missing Dependencies: {', '.join(missing)}"
            self.logger.error(error_message)
            self.logger.warning("Please install the missing dependencies and try again.")
            sys.exit(1)
        success_message = "All dependencies are satisfied."
        self.logger.info(success_message)

    def check_connectivity(self, callback_handlers=[]):
        """
        Verifies a secure connection with the IB Gateway.

        Args:
            callback_handlers (list): List of modules to receive callbacks.
        """
        self.logger.info("Checking Connectivity with IB Gateway...")

        host = self.config.get('Connectivity', 'host', fallback='127.0.0.1')
        port = self.config.getint('Connectivity', 'port', fallback=4002)
        client_id = self.config.getint('Connectivity', 'client_id', fallback=0)

        # Define wrapper and client for connection test
        class TestWrapper(EWrapper):
            def __init__(self, logger, callback_handlers):
                super().__init__()
                self.nextValidOrderId = None
                self.connected_flag = False
                self.received_account_summary = False
                self.logger = logger
                self.callback_handlers = callback_handlers

            @iswrapper
            def error(self, reqId, errorCode, errorString, advancedOrderRejectJson=""):
                info_codes = {2104, 2106, 2107, 2158}  # Include code 2106 based on your logs
                if errorCode in info_codes:
                    self.logger.info(f"[INFO/STATUS] id={reqId}, code={errorCode}, msg={errorString}")
                else:
                    self.logger.error(f"[ERROR] id={reqId}, code={errorCode}, msg={errorString}")

            @iswrapper
            def nextValidId(self, orderId: int):
                self.logger.info(f"[INFO] Next valid order ID: {orderId}")
                self.nextValidOrderId = orderId
                self.connected_flag = True

            @iswrapper
            def accountSummary(self, reqId: int, account: str, tag: str, value: str, currency: str):
                self.received_account_summary = True
                self.logger.info(f"[ACCOUNT SUMMARY] ReqId:{reqId}, Account:{account}, {tag} = {value} {currency}")

            @iswrapper
            def accountSummaryEnd(self, reqId: int):
                self.logger.info(f"[ACCOUNT SUMMARY END] ReqId: {reqId}")

            @iswrapper
            def managedAccounts(self, accountsList: str):
                self.logger.info(f"[INFO] Managed accounts: {accountsList}")

            @iswrapper
            def connectionClosed(self):
                self.logger.error("[ERROR] Connection to IB Gateway was closed unexpectedly!")
                # Notify callback handlers if needed

            @iswrapper
            def scannerData(self, reqId: int, rank: int, contractDetails: Contract, distance: str,
                            benchmark: str, projection: str, legsStr: str):
                # Dispatch to callback handlers
                for handler in self.callback_handlers:
                    handler.scannerData(reqId, rank, contractDetails, distance, benchmark, projection, legsStr)

            @iswrapper
            def scannerDataEnd(self, reqId: int):
                self.logger.info(f"Scanner data end received for reqId: {reqId}")
                for handler in self.callback_handlers:
                    handler.scannerDataEnd(reqId)

        class TestClient(EClient):
            def __init__(self, wrapper):
                super().__init__(wrapper)

        class ConnectionTestApp(TestWrapper, TestClient):
            def __init__(self, host: str, port: int, client_id: int, logger, callback_handlers=[]):
                TestWrapper.__init__(self, logger, callback_handlers)
                TestClient.__init__(self, self)
                self.host = host
                self.port = port
                self.client_id = client_id

            def connect_and_run(self):
                self.logger.info(SEPARATOR)
                self.logger.info(" IB Gateway Connection Test")
                self.logger.info(SEPARATOR)
                connection_message = f"Attempting to connect to IB Gateway at {self.host}:{self.port}..."
                self.logger.info(connection_message)

                try:
                    self.connect(self.host, self.port, self.client_id)
                except ConnectionRefusedError:
                    error_message = "[ERROR] Connection refused. Is IB Gateway running?"
                    self.logger.error(error_message)
                    sys.exit(1)

                # Start the EClient message processing thread
                thread = threading.Thread(target=self.run, daemon=True)
                thread.start()

                # Wait until connected or timeout
                start_time = time.time()
                timeout = 5  # seconds
                while not self.connected_flag and (time.time() - start_time < timeout):
                    time.sleep(0.1)

                if not self.connected_flag:
                    error_message = "[ERROR] Connection not established within timeout."
                    self.logger.error(error_message)
                    warning_message = "[WARN] No connection. Check Gateway settings and try again."
                    self.logger.warning(warning_message)
                    self.disconnect()
                    sys.exit(1)

                success_message = "[INFO] Connected successfully!"
                self.logger.info(success_message)

                request_message = "Requesting account summary..."
                self.logger.info(request_message)

                # Request account summary to verify further communication
                req_id = 1
                self.reqAccountSummary(req_id, "All", "NetLiquidation,TotalCashValue,EquityWithLoanValue,BuyingPower")

                # Wait a bit for responses
                time.sleep(5)
                self.cancelAccountSummary(req_id)

                # Check if we received account summary data
                if self.received_account_summary:
                    success_summary = "[INFO] Successfully retrieved account summary data."
                    self.logger.info(success_summary)
                else:
                    warning_summary = "[WARN] Connected but did not receive account summary data. Is the account funded or available?"
                    self.logger.warning(warning_summary)

                self.logger.info("IB Gateway is connected and ready for upcoming modules.")
                self.logger.info(SEPARATOR)
                self.logger.info(" Test Complete")
                self.logger.info(SEPARATOR)

        # Initialize and run the connection test
        app = ConnectionTestApp(host, port, client_id, self.logger, callback_handlers)
        app.connect_and_run()

        # Return the app to keep the connection open
        return app

    def check_system_resources(self):
        """
        Logs system resource information such as CPU cores, clock speed, load averages,
        CPU threads, and detailed RAM information.
        The system information is colored gold for better visibility.
        """
        self.logger.info("Checking System Resources...")

        # Gather system information
        cpu_cores = psutil.cpu_count(logical=True)
        cpu_freq = psutil.cpu_freq()
        cpu_load_avg = psutil.getloadavg()
        cpu_threads = psutil.cpu_count(logical=True)
        ram = psutil.virtual_memory()
        swap = psutil.swap_memory()

        total_ram_gb = ram.total / (1024 ** 3)
        used_ram_gb = ram.used / (1024 ** 3)
        available_ram_gb = ram.available / (1024 ** 3)
        ram_percent = ram.percent

        swap_total_gb = swap.total / (1024 ** 3)
        swap_used_gb = swap.used / (1024 ** 3)
        swap_percent = swap.percent

        # Construct the resource information string
        resource_info = (
            f"CPU Cores: {cpu_cores}\n"
            f"CPU Clock Speed: {cpu_freq.current:.2f} MHz\n"
            f"CPU Load Average (1m, 5m, 15m): {cpu_load_avg}\n"
            f"CPU Threads: {cpu_threads}\n"
            f"Total RAM: {total_ram_gb:.2f} GB\n"
            f"Used RAM: {used_ram_gb:.2f} GB ({ram_percent}%)\n"
            f"Available RAM: {available_ram_gb:.2f} GB\n"
            f"Total Swap: {swap_total_gb:.2f} GB\n"
            f"Used Swap: {swap_used_gb:.2f} GB ({swap_percent}%)"
        )

        # ANSI escape code for gold (approximated by yellow)
        gold_color = "\033[93m"  # Bright Yellow as gold approximation
        reset_color = "\033[0m"

        # Combine the newline and colored resource information
        colored_resource_info = f"\n{gold_color}{resource_info}{reset_color}"

        # Log the colored resource information
        self.logger.info(colored_resource_info)

    def run_all_checks(self, skip_checks=False, callback_handlers=[]):
        """
        Executes all initial checks in the required sequence.

        Args:
            skip_checks (bool): If True, skips specific checks like dependency checks.
            callback_handlers (list): List of modules to receive callbacks.

        Returns:
            connected_client: The connected ibapi client to be used by other modules.
        """
        self.check_os()
        if not skip_checks:
|
self.check_dependencies()
|
||||||
|
else:
|
||||||
|
warning_message = "Skipping dependency checks as per the '--skip-checks' flag."
|
||||||
|
self.logger.warning(warning_message)
|
||||||
|
self.check_system_resources()
|
||||||
|
connected_client = self.check_connectivity(callback_handlers)
|
||||||
|
success_message = "All initial checks passed successfully."
|
||||||
|
self.logger.info(success_message)
|
||||||
|
return connected_client
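The checks above gate on asynchronous IB callbacks: a background thread delivers the reply, and the caller blocks with a bounded timeout rather than polling forever. A minimal, self-contained sketch of that `threading.Event` wait-with-timeout pattern (standalone, no `ibapi` required; the names and values are illustrative only):

```python
import threading
import time

done = threading.Event()
result = {}

def worker():
    # Stand-in for an asynchronous IB API callback arriving later
    time.sleep(0.2)
    result["price"] = 101.5
    done.set()

threading.Thread(target=worker, daemon=True).start()

# Block for at most 5 seconds; returns True if the event was set in time
if done.wait(timeout=5):
    print(f"received: {result['price']}")  # → received: 101.5
else:
    print("timed out")
```

The same shape appears in Module 2's scanner wait and in the market-data helpers: set the event from the callback, `wait(timeout=...)` from the requester, and treat a `False` return as a timeout.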
444
src/MidasV1/modules/stock_list_petitioner.py
Normal file
@@ -0,0 +1,444 @@
# modules/stock_list_petitioner.py

"""
========================================================================
# README
#
# Module: stock_list_petitioner.py
#
# Description:
#   This module handles requesting and refining a list of stock symbols
#   based on predefined criteria. It interacts with the Interactive Brokers
#   (IB) API to perform scanner subscriptions, receive market data, and
#   apply filters to generate a refined list of stocks suitable for
#   trading strategies.
#
# Features:
#   - Initiates scanner subscriptions to retrieve stock data based on
#     criteria such as search volume and net change.
#   - Receives and processes scanner data asynchronously using IB API callbacks.
#   - Refines the received stock list by applying additional criteria such as
#     share price, availability of option contracts, and volatility index.
#   - Caches the refined stock list for use by subsequent modules.
#   - Provides detailed logging and colored console output for traceability
#     and user feedback.
#
# Usage:
#   This module is instantiated and used by `main.py` after initial checks pass.
#
# Example:
#   from modules.stock_list_petitioner import StockListPetitioner
#   stock_petitioner = StockListPetitioner(config)
#   connected_client = initial_checks.run_all_checks(callback_handlers=[stock_petitioner])
#   refined_stock_list = stock_petitioner.run_module()
#
# Coded by: kleinpanic 2024
========================================================================
"""

import logging
import threading
import tempfile
import json
import time
import os

from ibapi.contract import Contract
from ibapi.scanner import ScannerSubscription
from ibapi.ticktype import TickTypeEnum
from ibapi.utils import iswrapper
from ibapi.tag_value import TagValue

class StockListPetitioner:
    def __init__(self, config):
        self.logger = logging.getLogger('MidasV1.StockListPetitioner')
        self.config = config
        self.connected_client = None  # Set after initial checks complete

        # Scanner results
        self.scanner_data = []
        self.scanner_finished = False
        self.lock = threading.Lock()

        # Event to signal when scanner data has been received
        self.scanner_event = threading.Event()

    def set_client(self, connected_client):
        """
        Sets the connected client after initial checks.
        """
        self.connected_client = connected_client

    def run_module(self):
        """
        Executes the scanner subscription and refines the stock list.
        """
        if not self.connected_client:
            self.logger.error("Connected client is not set. Cannot proceed with scanner subscription.")
            return []

        self.logger.info("Starting Module 2: IBJTS List Petitioner...")

        # Load scanner criteria from config
        search_volume = self.config.getint('Module2', 'default_search_volume', fallback=10000)
        net_change = self.config.getfloat('Module2', 'default_net_change', fallback=0.0)
        percent_change = self.config.getfloat('Module2', 'default_percent_change', fallback=0.0)

        # Display and log the criteria
        criteria_message = (
            f"Loaded Scanner Criteria:\n"
            f"  - Search Volume: {search_volume}\n"
            f"  - Net Change: {net_change}\n"
            f"  - Percent Change: {percent_change}"
        )
        self.logger.info(criteria_message)
        print("\033[94m" + criteria_message + "\033[0m")  # Blue text for criteria

        # Define the scanner subscription
        subscription = ScannerSubscription()
        subscription.instrument = "STK"
        subscription.locationCode = "STK.US.MAJOR"  # Broad location code covering major US stocks
        subscription.scanCode = "ALL"               # Broad scan code to include all stocks
        subscription.aboveVolume = search_volume
        # subscription.netChange = net_change          # Removed for compatibility with "ALL"
        # subscription.percentChange = percent_change  # Removed for compatibility with "ALL"

        # Inform the user about the API call
        api_call_message = "Initiating scanner subscription with the above criteria..."
        self.logger.info(api_call_message)
        print("\033[92m" + api_call_message + "\033[0m")  # Green text for API call info

        # Request the subscription, retrying with exponential backoff on failure
        MAX_RETRIES = 2
        for attempt in range(1, MAX_RETRIES + 1):
            try:
                self.connected_client.reqScannerSubscription(
                    reqId=1001,
                    subscription=subscription,
                    scannerSubscriptionOptions=[],       # Can be extended via config
                    scannerSubscriptionFilterOptions=[]  # Can be extended via config
                )
                self.logger.info(f"Scanner subscription requested successfully on attempt {attempt}.")
                print(f"\033[92mScanner subscription requested successfully on attempt {attempt}.\033[0m")
                break
            except Exception as e:
                self.logger.error(f"Attempt {attempt}: Error in reqScannerSubscription: {e}")
                print(f"\033[91mAttempt {attempt}: Error in reqScannerSubscription: {e}\033[0m")
                if attempt == MAX_RETRIES:
                    return []
                time.sleep(2 ** attempt)  # Exponential backoff

        # Wait for scanner data, or time out
        scanner_timeout = 15  # seconds
        self.logger.info(f"Waiting for scanner data (timeout in {scanner_timeout} seconds)...")
        print(f"\033[93mWaiting for scanner data (timeout in {scanner_timeout} seconds)...\033[0m")
        scanner_completed = self.scanner_event.wait(timeout=scanner_timeout)
        if not scanner_completed:
            self.logger.error("Scanner subscription timed out.")
            try:
                self.connected_client.cancelScannerSubscription(1001)
                self.logger.error("Scanner subscription canceled due to timeout.")
                print("\033[91mScanner subscription timed out and was canceled.\033[0m")
            except Exception as e:
                self.logger.error(f"Error canceling scanner subscription: {e}")
                print(f"\033[91mError canceling scanner subscription: {e}\033[0m")
            return []

        self.logger.info("Scanner data received. Proceeding to refine the stock list.")
        print("\033[92mScanner data received. Proceeding to refine the stock list.\033[0m")

        # Log the number of scanner data entries received
        self.logger.debug(f"Total scanner data received: {len(self.scanner_data)}")
        print(f"\033[94mTotal scanner data received: {len(self.scanner_data)}\033[0m")

        for stock in self.scanner_data:
            self.logger.debug(f"Stock: {stock['symbol']}, Distance: {stock['distance']}")
            # Optionally, print detailed scanner data for debugging
            # print(f"\033[94mStock: {stock['symbol']}, Distance: {stock['distance']}\033[0m")

        # Process and refine the scanner data
        refined_list = self.refine_stock_list()

        # Cache the refined list for subsequent modules
        self.cache_refined_list(refined_list)

        # Print the refined list for the user
        self.print_refined_list(refined_list)

        if not refined_list:
            self.logger.error("No stocks meet the specified criteria after refinement.")
            print("\033[91mNo stocks meet the specified criteria after refinement.\033[0m\n")

        # Disconnect the client to prevent further logging
        try:
            self.connected_client.disconnect()
            self.logger.info("Disconnected from IB Gateway after Module 2.")
            print("\033[92mDisconnected from IB Gateway after Module 2.\033[0m")
        except Exception as e:
            self.logger.error(f"Error disconnecting from IB Gateway: {e}")
            print(f"\033[91mError disconnecting from IB Gateway: {e}\033[0m")

        self.logger.info("Module 2: IBJTS List Petitioner completed successfully.")
        print("\033[92mModule 2: IBJTS List Petitioner completed successfully.\033[0m")

        return refined_list

    @iswrapper
    def scannerData(self, reqId: int, rank: int, contractDetails, distance: str,
                    benchmark: str, projection: str, legsStr: str):
        """
        Receives one row of scanner data.
        """
        with self.lock:
            self.scanner_data.append({
                'rank': rank,
                'symbol': contractDetails.contract.symbol,
                'sectype': contractDetails.contract.secType,
                'exchange': contractDetails.contract.exchange,
                'currency': contractDetails.contract.currency,
                'distance': distance,
                'benchmark': benchmark,
                'projection': projection,
                'legsStr': legsStr
            })
            self.logger.debug(f"Received scanner data: {self.scanner_data[-1]}")

    @iswrapper
    def scannerDataEnd(self, reqId: int):
        """
        Indicates the end of scanner data.
        """
        self.logger.info(f"Scanner data end received for reqId: {reqId}")
        with self.lock:
            self.scanner_finished = True
            self.scanner_event.set()

    def refine_stock_list(self):
        """
        Refines the scanner data based on additional criteria.

        Returns:
            list: Refined list of stocks.
        """
        self.logger.info("Refining the stock list based on criteria...")
        print("\033[93mRefining the stock list based on criteria...\033[0m")

        refined_list = []
        for stock in self.scanner_data:
            symbol = stock['symbol']
            self.logger.debug(f"Processing stock: {symbol}")
            print(f"\033[94mProcessing stock: {symbol}\033[0m")

            # Fetch additional data for each stock
            share_price = self.get_share_price(symbol)
            if share_price is None:
                self.logger.debug(f"Skipping {symbol}: Unable to retrieve share price.")
                print(f"\033[91mSkipping {symbol}: Unable to retrieve share price.\033[0m")
                continue  # Skip if unable to fetch the share price

            if share_price > self.config.getfloat('Module2', 'default_refinement_share_price', fallback=15.0):
                self.logger.debug(f"Excluding {symbol}: Share price ${share_price} exceeds threshold.")
                print(f"\033[91mExcluding {symbol}: Share price ${share_price} exceeds threshold.\033[0m")
                continue  # Remove stocks above the share-price threshold

            if not self.has_option_contracts(symbol):
                self.logger.debug(f"Excluding {symbol}: No option contracts available.")
                print(f"\033[91mExcluding {symbol}: No option contracts available.\033[0m")
                continue  # Remove stocks without option contracts

            volatility_index = self.get_volatility_index(symbol)
            if volatility_index is None:
                self.logger.debug(f"Skipping {symbol}: Unable to retrieve volatility index.")
                print(f"\033[91mSkipping {symbol}: Unable to retrieve volatility index.\033[0m")
                continue  # Skip if unable to fetch the volatility index

            if volatility_index > self.config.getfloat('Module2', 'default_volatility_threshold', fallback=30.0):
                self.logger.debug(f"Excluding {symbol}: Volatility index {volatility_index}% exceeds threshold.")
                print(f"\033[91mExcluding {symbol}: Volatility index {volatility_index}% exceeds threshold.\033[0m")
                continue  # Remove stocks above the volatility threshold

            # Append to the refined list if all criteria are met
            refined_list.append({
                'symbol': symbol,
                'share_price': share_price,
                'volatility_index': volatility_index
            })
            self.logger.debug(f"Including {symbol}: Meets all criteria.")
            print(f"\033[92mIncluding {symbol}: Meets all criteria.\033[0m")

        # Conditional refinement based on config
        conditional_refinement = self.config.getboolean('Module2', 'conditional_refinement_enabled', fallback=False)
        if conditional_refinement:
            max_list_size = self.config.getint('Module2', 'max_refined_list_size', fallback=100)
            if len(refined_list) > max_list_size:
                refined_list = refined_list[:max_list_size]
                self.logger.info(f"List truncated to {max_list_size} items based on conditional refinement.")
                print(f"\033[93mList truncated to {max_list_size} items based on conditional refinement.\033[0m")

        self.logger.info(f"Refined list contains {len(refined_list)} stocks after applying all filters.")
        print(f"\033[94mRefined list contains {len(refined_list)} stocks after applying all filters.\033[0m")
        return refined_list

    def get_share_price(self, symbol):
        """
        Retrieves the current share price for a given symbol.

        Args:
            symbol (str): Stock symbol.

        Returns:
            float: Current share price, or None if unavailable.
        """
        contract = Contract()
        contract.symbol = symbol
        contract.secType = "STK"
        contract.exchange = "SMART"
        contract.currency = "USD"

        price = None
        price_event = threading.Event()

        def tickPrice_override(reqId, tickType, price_value, attrib):
            nonlocal price
            if tickType == TickTypeEnum.LAST:
                price = price_value
                price_event.set()

        # Temporarily override the tickPrice callback
        original_tickPrice = self.connected_client.wrapper.tickPrice
        self.connected_client.wrapper.tickPrice = tickPrice_override

        # Request market data
        try:
            self.connected_client.reqMktData(2001, contract, "", False, False, [])
            self.logger.debug(f"Requested market data for {symbol}.")
            print(f"\033[94mRequested market data for {symbol}.\033[0m")
        except Exception as e:
            self.logger.error(f"Error requesting market data for {symbol}: {e}")
            print(f"\033[91mError requesting market data for {symbol}: {e}\033[0m")
            self.connected_client.wrapper.tickPrice = original_tickPrice
            return None

        # Wait for the price to be received, or time out
        if not price_event.wait(timeout=5):
            self.logger.warning(f"Timeout while waiting for share price of {symbol}.")
            print(f"\033[91mTimeout while waiting for share price of {symbol}.\033[0m")
        else:
            self.logger.debug(f"Share price for {symbol}: ${price}")
            print(f"\033[92mShare price for {symbol}: ${price}\033[0m")

        # Restore the original tickPrice callback
        self.connected_client.wrapper.tickPrice = original_tickPrice

        if price is not None:
            return price
        self.logger.warning(f"Unable to retrieve share price for {symbol}.")
        print(f"\033[91mUnable to retrieve share price for {symbol}.\033[0m")
        return None

    def has_option_contracts(self, symbol):
        """
        Checks whether option contracts are available for a given symbol.

        Args:
            symbol (str): Stock symbol.

        Returns:
            bool: True if options are available, False otherwise.
        """
        contract = Contract()
        contract.symbol = symbol
        contract.secType = "OPT"
        contract.exchange = "SMART"
        contract.currency = "USD"

        has_options = False
        option_event = threading.Event()

        def contractDetails_override(reqId, contractDetails):
            nonlocal has_options
            if contractDetails.contract.symbol == symbol:
                has_options = True
                option_event.set()

        # Temporarily override the contractDetails callback
        original_contractDetails = self.connected_client.wrapper.contractDetails
        self.connected_client.wrapper.contractDetails = contractDetails_override

        # Request contract details
        try:
            self.connected_client.reqContractDetails(3001, contract)
            self.logger.debug(f"Requested contract details for options of {symbol}.")
            print(f"\033[94mRequested contract details for options of {symbol}.\033[0m")
        except Exception as e:
            self.logger.error(f"Error requesting contract details for {symbol}: {e}")
            print(f"\033[91mError requesting contract details for {symbol}: {e}\033[0m")
            self.connected_client.wrapper.contractDetails = original_contractDetails
            return False

        # Wait for the callback, or time out
        if not option_event.wait(timeout=5):
            self.logger.warning(f"Timeout while checking options for {symbol}.")
            print(f"\033[91mTimeout while checking options for {symbol}.\033[0m")
        else:
            self.logger.debug(f"Options availability for {symbol}: {has_options}")
            print(f"\033[92mOptions availability for {symbol}: {has_options}\033[0m")

        # Restore the original contractDetails callback
        self.connected_client.wrapper.contractDetails = original_contractDetails

        return has_options

    def get_volatility_index(self, symbol):
        """
        Retrieves the volatility index for a given symbol.
        This is a placeholder; implement actual volatility retrieval as needed.

        Args:
            symbol (str): Stock symbol.

        Returns:
            float: Volatility index, or None if unavailable.
        """
        # Placeholder implementation.
        # Replace with a real source, e.g. an external API or IB's own data.
        mock_volatility = 25.0  # Example value
        self.logger.debug(f"Volatility index for {symbol}: {mock_volatility}%")
        print(f"\033[94mVolatility index for {symbol}: {mock_volatility}%\033[0m")
        return mock_volatility

    def cache_refined_list(self, refined_list):
        """
        Caches the refined stock list in a temporary file for transfer between modules.

        Args:
            refined_list (list): Refined list of stocks.
        """
        try:
            timestamp = int(time.time())
            cache_path = os.path.join(tempfile.gettempdir(), f"refined_stock_list_{timestamp}.json")
            with open(cache_path, 'w') as tmp_file:
                json.dump(refined_list, tmp_file)
            self.logger.info(f"Refined stock list cached at {cache_path}")
            print(f"\033[92mRefined stock list cached at {cache_path}\033[0m")
        except Exception as e:
            self.logger.error(f"Failed to cache refined stock list: {e}")
            print(f"\033[91mFailed to cache refined stock list: {e}\033[0m")

    def print_refined_list(self, refined_list):
        """
        Prints the refined stock list for the user.

        Args:
            refined_list (list): Refined list of stocks.
        """
        if not refined_list:
            self.logger.error("No stocks meet the specified criteria after refinement.")
            print("\033[91mNo stocks meet the specified criteria after refinement.\033[0m\n")
            return

        self.logger.info("Refined Stock List:")
        print("\n\033[92mRefined Stock List:\033[0m")
        print("--------------------")
        for stock in refined_list:
            print(f"Symbol: {stock['symbol']}, Share Price: ${stock['share_price']:.2f}, Volatility Index: {stock['volatility_index']}%")
        print("--------------------\n")
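`get_volatility_index()` above is an acknowledged placeholder returning a constant. One common stand-in, sketched below under the assumption that daily closing prices are available from some source, is annualized historical volatility: the standard deviation of daily log returns scaled by the square root of the trading days per year. The function name and inputs here are illustrative, not part of the module:

```python
import math
import statistics

def historical_volatility(closes, trading_days=252):
    """Annualized historical volatility (in %) from daily closing prices.

    Hypothetical replacement for the get_volatility_index() placeholder:
    stdev of daily log returns, annualized by sqrt(trading_days).
    """
    if len(closes) < 3:
        return None  # not enough data for a meaningful estimate
    log_returns = [math.log(b / a) for a, b in zip(closes, closes[1:])]
    return statistics.stdev(log_returns) * math.sqrt(trading_days) * 100

# A flat price series has zero volatility; a noisy one does not.
print(historical_volatility([10.0] * 30))                       # → 0.0
print(historical_volatility([10, 10.2, 9.9, 10.4, 10.1]) > 0)   # → True
```

The result is a percentage, so it plugs directly into the existing `default_volatility_threshold` comparison in `refine_stock_list()`.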
2
src/MidasV1/requirements.txt
Normal file
@@ -0,0 +1,2 @@
ibapi==10.32.1
psutil==6.1.0
183
src/MidasV1/tests/test_connection.py
Normal file
@@ -0,0 +1,183 @@
# test_connection.py

"""
========================================================================
# README
#
# Program: test_connection.py
#
# Description:
#   This script tests connectivity between the MidasV1 Trading Bot and the
#   Interactive Brokers (IB) Gateway. It attempts to establish a connection,
#   requests an account summary, and verifies that data is received correctly.
#   Colored console output indicates the status of each operation.
#
# Features:
#   - Checks for the `ibapi` Python package and prompts installation if missing.
#   - Attempts to connect to the IB Gateway using the specified host, port, and client ID.
#   - Requests account summary data to verify successful communication.
#   - Implements colored console output using ANSI escape codes:
#       - Green for successful operations.
#       - Red for errors.
#       - Yellow for warnings and informational messages.
#       - Blue and Magenta for decorative separators and headers.
#   - Handles graceful shutdown on interrupts and errors.
#
# Usage:
#   Run the script from the command line:
#       python test_connection.py
#   The script displays messages indicating the progress and outcome of each step.
#
# Coded by: kleinpanic 2024
========================================================================
"""

import sys
import time
import threading

# Check for the ibapi dependency
try:
    from ibapi.client import EClient
    from ibapi.wrapper import EWrapper
    from ibapi.utils import iswrapper
except ImportError:
    print("┌───────────────────────────────────────────────────┐")
    print("│ IB API Python Not Found! │")
    print("└───────────────────────────────────────────────────┘")
    print("\nThe 'ibapi' package is not installed. Please install it by running:")
    print("    pip install ibapi")
    sys.exit(1)

# ANSI color codes
GREEN = "\033[92m"
RED = "\033[91m"
YELLOW = "\033[93m"
BLUE = "\033[94m"
MAGENTA = "\033[95m"
BOLD = "\033[1m"
RESET = "\033[0m"

SEPARATOR = MAGENTA + "────────────────────────────────────────────────────" + RESET


class TestWrapper(EWrapper):
    def __init__(self):
        super().__init__()
        self.nextValidOrderId = None
        self.connected_flag = False
        self.received_account_summary = False

    @iswrapper
    def error(self, reqId, errorCode, errorString, advancedOrderRejectJson=""):
        # Distinguish between known "info" messages and real errors.
        # For example, error codes 2104, 2107, and 2158 are typically informational.
        info_codes = {2104, 2107, 2158}
        if errorCode in info_codes:
            # Print these in a different color to indicate they are not severe
            print(f"{YELLOW}[INFO/STATUS] id={reqId}, code={errorCode}, msg={errorString}{RESET}")
        else:
            # True errors
            print(f"{RED}[ERROR] id={reqId}, code={errorCode}, msg={errorString}{RESET}")

    @iswrapper
    def nextValidId(self, orderId: int):
        print(f"{GREEN}[INFO] Next valid order ID: {orderId}{RESET}")
        self.nextValidOrderId = orderId
        self.connected_flag = True

    @iswrapper
    def accountSummary(self, reqId: int, account: str, tag: str, value: str, currency: str):
        self.received_account_summary = True
        print(f"{GREEN}[ACCOUNT SUMMARY] ReqId:{reqId}, Account:{account}, {tag} = {value} {currency}{RESET}")

    @iswrapper
    def accountSummaryEnd(self, reqId: int):
        print(f"{GREEN}[ACCOUNT SUMMARY END] ReqId: {reqId}{RESET}")

    @iswrapper
    def managedAccounts(self, accountsList: str):
        print(f"{GREEN}[INFO] Managed accounts: {accountsList}{RESET}")


class TestClient(EClient):
    def __init__(self, wrapper):
        super().__init__(wrapper)


class ConnectionTestApp(TestWrapper, TestClient):
    def __init__(self, host: str, port: int, client_id: int):
        TestWrapper.__init__(self)
        TestClient.__init__(self, self)

        self.host = host
        self.port = port
        self.client_id = client_id

    def connect_and_run(self):
        print(SEPARATOR)
        print(f"{BOLD}{BLUE} IB Gateway Connection Test{RESET}")
        print(SEPARATOR)
        print(f"{BLUE}Attempting to connect to IB Gateway at {self.host}:{self.port}...{RESET}")

        # Attempt connection with error handling
        try:
            self.connect(self.host, self.port, self.client_id)
        except ConnectionRefusedError:
            print(f"{RED}[ERROR] Connection refused. Is IB Gateway running?{RESET}")
            return

        # Start the EClient message-processing thread
        thread = threading.Thread(target=self.run, daemon=True)
        thread.start()

        # Wait until connected, or time out
        start_time = time.time()
        timeout = 5  # seconds
        while not self.connected_flag and (time.time() - start_time < timeout):
            time.sleep(0.1)

        if not self.connected_flag:
            print(f"{RED}[ERROR] Connection not established within timeout.{RESET}")
            self.disconnect()
            print(f"{YELLOW}[WARN] No connection. Check Gateway settings and try again.{RESET}")
            return

        print(f"{GREEN}[INFO] Connected successfully!{RESET}")
        print(f"{BLUE}Requesting account summary...{RESET}")

        # Request account summary to verify further communication
        req_id = 1
        self.reqAccountSummary(req_id, "All", "NetLiquidation,TotalCashValue,EquityWithLoanValue,BuyingPower")

        # Wait a bit for responses
        time.sleep(5)
        self.cancelAccountSummary(req_id)
        self.disconnect()

        # Check whether we received account summary data
        if self.received_account_summary:
            print(f"{GREEN}[INFO] Successfully retrieved account summary data.{RESET}")
        else:
            print(f"{YELLOW}[WARN] Connected but did not receive account summary data. Is the account funded or available?{RESET}")

        print(f"{GREEN}[INFO] Disconnected successfully.{RESET}")
        print(SEPARATOR)
        print(f"{BOLD}{BLUE} Test Complete{RESET}")
        print(SEPARATOR)


if __name__ == "__main__":
    host = "127.0.0.1"
    port = 4002    # Paper trading port
    client_id = 0  # Choose a unique client ID

    # Check the Python version
    if sys.version_info < (3, 6):
        print(f"{RED}[ERROR] Python 3.6+ is required for this script.{RESET}")
        sys.exit(1)

    app = ConnectionTestApp(host, port, client_id)
    app.connect_and_run()
|
||||||
|
|
||||||
435
src/MidasV1/tests/test_stock_retriever.log
Normal file
@@ -0,0 +1,435 @@
2024-12-13 02:01:32,318 - DEBUG - 139881771299920 connState: None -> 0
2024-12-13 02:01:32,318 - DEBUG - 139881771299920 connState: None -> 0
2024-12-13 02:01:32,319 - DEBUG - Connecting to 127.0.0.1:4002 w/ id:1
2024-12-13 02:01:32,319 - DEBUG - 139881771299920 connState: 0 -> 1
2024-12-13 02:01:32,319 - DEBUG - msg b'\x00\x00\x00\tv100..193'
2024-12-13 02:01:32,319 - DEBUG - REQUEST b'API\x00\x00\x00\x00\tv100..193'
2024-12-13 02:01:32,319 - DEBUG - acquiring lock
2024-12-13 02:01:32,319 - DEBUG - acquired lock
2024-12-13 02:01:32,319 - DEBUG - releasing lock
2024-12-13 02:01:32,319 - DEBUG - release lock
2024-12-13 02:01:32,319 - DEBUG - sendMsg: sent: 17
2024-12-13 02:01:32,320 - DEBUG - no fields
2024-12-13 02:01:32,321 - DEBUG - len 30 raw:b'\x00\x00\x00\x1a187\x0020241213 02:01:31 EST\x00'|
2024-12-13 02:01:32,322 - DEBUG - ANSWER b'\x00\x00\x00\x1a187\x0020241213 02:01:31 EST\x00'
2024-12-13 02:01:32,322 - DEBUG - read_msg: size: 26
2024-12-13 02:01:32,322 - DEBUG - size:26 msg:b'187\x0020241213 02:01:31 EST\x00' rest:b''|
2024-12-13 02:01:32,322 - DEBUG - fields (b'187', b'20241213 02:01:31 EST')
2024-12-13 02:01:32,322 - DEBUG - ANSWER Version:187 time:b'20241213 02:01:31 EST'
2024-12-13 02:01:32,322 - DEBUG - 139881771299920 connState: 1 -> 2
2024-12-13 02:01:32,322 - DEBUG - EReader thread started
2024-12-13 02:01:32,322 - INFO - sent startApi
2024-12-13 02:01:32,322 - INFO - REQUEST startApi {}
2024-12-13 02:01:32,322 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,322 - INFO - SENDING startApi b'\x00\x00\x00\x0871\x002\x001\x00\x00'
2024-12-13 02:01:32,322 - DEBUG - acquiring lock
2024-12-13 02:01:32,322 - DEBUG - acquired lock
2024-12-13 02:01:32,322 - DEBUG - releasing lock
2024-12-13 02:01:32,322 - DEBUG - release lock
2024-12-13 02:01:32,322 - DEBUG - sendMsg: sent: 12
2024-12-13 02:01:32,322 - INFO - ANSWER connectAck {}
2024-12-13 02:01:32,322 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,323 - DEBUG - len 19 raw:b'\x00\x00\x00\x0f15\x001\x00DUE064818\x00'|
2024-12-13 02:01:32,323 - DEBUG - reader loop, recvd size 19
2024-12-13 02:01:32,323 - DEBUG - read_msg: size: 15
2024-12-13 02:01:32,323 - DEBUG - size:15 msg.size:15 msg:|b''| buf:||
2024-12-13 02:01:32,323 - DEBUG - fields (b'15', b'1', b'DUE064818')
2024-12-13 02:01:32,323 - DEBUG - In interpret(), handleInfo: wrap:<function EWrapper.managedAccounts at 0x7f38c36bd3a0> meth:None prms:OrderedDict([('self', <Parameter "self">), ('accountsList', <Parameter "accountsList: str">)])
2024-12-13 02:01:32,323 - DEBUG - field b'DUE064818'
2024-12-13 02:01:32,324 - DEBUG - arg DUE064818 type <class 'str'>
2024-12-13 02:01:32,324 - DEBUG - calling <bound method EWrapper.managedAccounts of <__main__.TestStockRetriever object at 0x7f38c349f450>> with <__main__.TestStockRetriever object at 0x7f38c349f450> ['DUE064818']
2024-12-13 02:01:32,324 - INFO - ANSWER managedAccounts {'accountsList': 'DUE064818'}
2024-12-13 02:01:32,324 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,324 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:32,324 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,366 - DEBUG - len 227 raw:b'\x00\x00\x00\x069\x001\x001\x00\x00\x00\x0064\x002\x00-1\x002104\x00Market data farm connection is OK:usfarm\x00\x00\x00\x00\x00^4\x002\x00-1\x002107\x00HMDS data farm connection is inactive but should be available upon demand.ushmds\x00\x00\x00\x00\x0094\x002\x00-1\x002158\x00Sec-def data farm connection is OK:secdefnj\x00\x00'|
2024-12-13 02:01:32,366 - DEBUG - reader loop, recvd size 227
2024-12-13 02:01:32,366 - DEBUG - read_msg: size: 6
2024-12-13 02:01:32,366 - DEBUG - size:6 msg.size:6 msg:|b'\x00\x00\x0064\x002\x00-1\x002104\x00Market data farm connection is OK:usfarm\x00\x00\x00\x00\x00^4\x002\x00-1\x002107\x00HMDS data farm connection is inactive but should be available upon demand.ushmds\x00\x00\x00\x00\x0094\x002\x00-1\x002158\x00Sec-def data farm connection is OK:secdefnj\x00\x00'| buf:||
2024-12-13 02:01:32,366 - DEBUG - read_msg: size: 54
2024-12-13 02:01:32,366 - DEBUG - size:54 msg.size:54 msg:|b'\x00\x00\x00^4\x002\x00-1\x002107\x00HMDS data farm connection is inactive but should be available upon demand.ushmds\x00\x00\x00\x00\x0094\x002\x00-1\x002158\x00Sec-def data farm connection is OK:secdefnj\x00\x00'| buf:||
2024-12-13 02:01:32,366 - DEBUG - read_msg: size: 94
2024-12-13 02:01:32,366 - DEBUG - size:94 msg.size:94 msg:|b'\x00\x00\x0094\x002\x00-1\x002158\x00Sec-def data farm connection is OK:secdefnj\x00\x00'| buf:||
2024-12-13 02:01:32,366 - DEBUG - read_msg: size: 57
2024-12-13 02:01:32,367 - DEBUG - size:57 msg.size:57 msg:|b''| buf:||
2024-12-13 02:01:32,367 - DEBUG - fields (b'9', b'1', b'1')
2024-12-13 02:01:32,367 - DEBUG - In interpret(), handleInfo: wrap:<function EWrapper.nextValidId at 0x7f38c36bce00> meth:None prms:OrderedDict([('self', <Parameter "self">), ('orderId', <Parameter "orderId: int">)])
2024-12-13 02:01:32,367 - DEBUG - field b'1'
2024-12-13 02:01:32,367 - DEBUG - arg 1 type <class 'int'>
2024-12-13 02:01:32,367 - DEBUG - calling <bound method TestStockRetriever.nextValidId of <__main__.TestStockRetriever object at 0x7f38c349f450>> with <__main__.TestStockRetriever object at 0x7f38c349f450> [1]
2024-12-13 02:01:32,367 - INFO - NextValidId received: 1
2024-12-13 02:01:32,367 - INFO - Requesting market data for symbols...
2024-12-13 02:01:32,367 - INFO - REQUEST reqMktData {'reqId': 1001, 'contract': 139881771406224: 0,AAPL,STK,,,0,,,SMART,,USD,,,False,,,,combo:, 'genericTickList': '', 'snapshot': False, 'regulatorySnapshot': False, 'mktDataOptions': []}
2024-12-13 02:01:32,367 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,367 - INFO - SENDING reqMktData b'\x00\x00\x0011\x0011\x001001\x000\x00AAPL\x00STK\x00\x000.0\x00\x00\x00SMART\x00\x00USD\x00\x00\x000\x00\x000\x000\x00\x00'
2024-12-13 02:01:32,367 - DEBUG - acquiring lock
2024-12-13 02:01:32,367 - DEBUG - acquired lock
2024-12-13 02:01:32,367 - DEBUG - releasing lock
2024-12-13 02:01:32,367 - DEBUG - release lock
2024-12-13 02:01:32,367 - DEBUG - sendMsg: sent: 53
2024-12-13 02:01:32,367 - DEBUG - Requested market data for AAPL with ReqId: 1001
2024-12-13 02:01:32,367 - INFO - REQUEST reqMktData {'reqId': 1002, 'contract': 139881771409296: 0,MSFT,STK,,,0,,,SMART,,USD,,,False,,,,combo:, 'genericTickList': '', 'snapshot': False, 'regulatorySnapshot': False, 'mktDataOptions': []}
2024-12-13 02:01:32,367 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,367 - INFO - SENDING reqMktData b'\x00\x00\x0011\x0011\x001002\x000\x00MSFT\x00STK\x00\x000.0\x00\x00\x00SMART\x00\x00USD\x00\x00\x000\x00\x000\x000\x00\x00'
2024-12-13 02:01:32,367 - DEBUG - acquiring lock
2024-12-13 02:01:32,367 - DEBUG - acquired lock
2024-12-13 02:01:32,367 - DEBUG - releasing lock
2024-12-13 02:01:32,367 - DEBUG - release lock
2024-12-13 02:01:32,367 - DEBUG - sendMsg: sent: 53
2024-12-13 02:01:32,368 - DEBUG - Requested market data for MSFT with ReqId: 1002
2024-12-13 02:01:32,368 - INFO - REQUEST reqMktData {'reqId': 1003, 'contract': 139881771409424: 0,GOOGL,STK,,,0,,,SMART,,USD,,,False,,,,combo:, 'genericTickList': '', 'snapshot': False, 'regulatorySnapshot': False, 'mktDataOptions': []}
2024-12-13 02:01:32,368 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,368 - INFO - SENDING reqMktData b'\x00\x00\x0021\x0011\x001003\x000\x00GOOGL\x00STK\x00\x000.0\x00\x00\x00SMART\x00\x00USD\x00\x00\x000\x00\x000\x000\x00\x00'
2024-12-13 02:01:32,368 - DEBUG - acquiring lock
2024-12-13 02:01:32,368 - DEBUG - acquired lock
2024-12-13 02:01:32,368 - DEBUG - releasing lock
2024-12-13 02:01:32,368 - DEBUG - release lock
2024-12-13 02:01:32,368 - DEBUG - sendMsg: sent: 54
2024-12-13 02:01:32,368 - DEBUG - Requested market data for GOOGL with ReqId: 1003
2024-12-13 02:01:32,368 - INFO - REQUEST reqMktData {'reqId': 1004, 'contract': 139881771409360: 0,AMZN,STK,,,0,,,SMART,,USD,,,False,,,,combo:, 'genericTickList': '', 'snapshot': False, 'regulatorySnapshot': False, 'mktDataOptions': []}
2024-12-13 02:01:32,368 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,368 - INFO - SENDING reqMktData b'\x00\x00\x0011\x0011\x001004\x000\x00AMZN\x00STK\x00\x000.0\x00\x00\x00SMART\x00\x00USD\x00\x00\x000\x00\x000\x000\x00\x00'
2024-12-13 02:01:32,368 - DEBUG - acquiring lock
2024-12-13 02:01:32,368 - DEBUG - acquired lock
2024-12-13 02:01:32,368 - DEBUG - releasing lock
2024-12-13 02:01:32,368 - DEBUG - release lock
2024-12-13 02:01:32,368 - DEBUG - sendMsg: sent: 53
2024-12-13 02:01:32,368 - DEBUG - Requested market data for AMZN with ReqId: 1004
2024-12-13 02:01:32,368 - INFO - REQUEST reqMktData {'reqId': 1005, 'contract': 139881771409552: 0,TSLA,STK,,,0,,,SMART,,USD,,,False,,,,combo:, 'genericTickList': '', 'snapshot': False, 'regulatorySnapshot': False, 'mktDataOptions': []}
2024-12-13 02:01:32,368 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,368 - INFO - SENDING reqMktData b'\x00\x00\x0011\x0011\x001005\x000\x00TSLA\x00STK\x00\x000.0\x00\x00\x00SMART\x00\x00USD\x00\x00\x000\x00\x000\x000\x00\x00'
2024-12-13 02:01:32,368 - DEBUG - acquiring lock
2024-12-13 02:01:32,368 - DEBUG - acquired lock
2024-12-13 02:01:32,368 - DEBUG - releasing lock
2024-12-13 02:01:32,368 - DEBUG - release lock
2024-12-13 02:01:32,368 - DEBUG - sendMsg: sent: 53
2024-12-13 02:01:32,368 - DEBUG - Requested market data for TSLA with ReqId: 1005
2024-12-13 02:01:32,368 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,368 - DEBUG - conn:1 queue.sz:3
2024-12-13 02:01:32,368 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,368 - DEBUG - fields (b'4', b'2', b'-1', b'2104', b'Market data farm connection is OK:usfarm', b'')
2024-12-13 02:01:32,368 - DEBUG - decode <class 'int'> b'2'
2024-12-13 02:01:32,368 - DEBUG - decode <class 'int'> b'-1'
2024-12-13 02:01:32,368 - DEBUG - decode <class 'int'> b'2104'
2024-12-13 02:01:32,368 - DEBUG - decode <class 'str'> b'Market data farm connection is OK:usfarm'
2024-12-13 02:01:32,369 - DEBUG - decode <class 'str'> b''
2024-12-13 02:01:32,369 - ERROR - Error. ReqId: -1, Code: 2104, Msg: Market data farm connection is OK:usfarm, Advanced Order Reject JSON:
2024-12-13 02:01:32,369 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,369 - DEBUG - conn:1 queue.sz:2
2024-12-13 02:01:32,369 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,369 - DEBUG - fields (b'4', b'2', b'-1', b'2107', b'HMDS data farm connection is inactive but should be available upon demand.ushmds', b'')
2024-12-13 02:01:32,369 - DEBUG - decode <class 'int'> b'2'
2024-12-13 02:01:32,369 - DEBUG - decode <class 'int'> b'-1'
2024-12-13 02:01:32,369 - DEBUG - decode <class 'int'> b'2107'
2024-12-13 02:01:32,369 - DEBUG - decode <class 'str'> b'HMDS data farm connection is inactive but should be available upon demand.ushmds'
2024-12-13 02:01:32,369 - DEBUG - decode <class 'str'> b''
2024-12-13 02:01:32,369 - ERROR - Error. ReqId: -1, Code: 2107, Msg: HMDS data farm connection is inactive but should be available upon demand.ushmds, Advanced Order Reject JSON:
2024-12-13 02:01:32,369 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,369 - DEBUG - conn:1 queue.sz:1
2024-12-13 02:01:32,369 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,369 - DEBUG - fields (b'4', b'2', b'-1', b'2158', b'Sec-def data farm connection is OK:secdefnj', b'')
2024-12-13 02:01:32,369 - DEBUG - decode <class 'int'> b'2'
2024-12-13 02:01:32,369 - DEBUG - decode <class 'int'> b'-1'
2024-12-13 02:01:32,369 - DEBUG - decode <class 'int'> b'2158'
2024-12-13 02:01:32,369 - DEBUG - decode <class 'str'> b'Sec-def data farm connection is OK:secdefnj'
2024-12-13 02:01:32,369 - DEBUG - decode <class 'str'> b''
2024-12-13 02:01:32,369 - ERROR - Error. ReqId: -1, Code: 2158, Msg: Sec-def data farm connection is OK:secdefnj, Advanced Order Reject JSON:
2024-12-13 02:01:32,369 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,369 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:32,369 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,433 - DEBUG - len 203 raw:b"\x00\x00\x00\xc74\x002\x001001\x0010089\x00Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.AAPL NASDAQ.NMS/TOP/ALL\x00\x00"|
2024-12-13 02:01:32,433 - DEBUG - reader loop, recvd size 203
2024-12-13 02:01:32,433 - DEBUG - read_msg: size: 199
2024-12-13 02:01:32,433 - DEBUG - size:199 msg.size:199 msg:|b''| buf:||
2024-12-13 02:01:32,434 - DEBUG - fields (b'4', b'2', b'1001', b'10089', b"Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.AAPL NASDAQ.NMS/TOP/ALL", b'')
2024-12-13 02:01:32,434 - DEBUG - decode <class 'int'> b'2'
2024-12-13 02:01:32,434 - DEBUG - decode <class 'int'> b'1001'
2024-12-13 02:01:32,434 - DEBUG - decode <class 'int'> b'10089'
2024-12-13 02:01:32,434 - DEBUG - decode <class 'str'> b"Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.AAPL NASDAQ.NMS/TOP/ALL"
2024-12-13 02:01:32,434 - DEBUG - decode <class 'str'> b''
2024-12-13 02:01:32,434 - ERROR - Error. ReqId: 1001, Code: 10089, Msg: Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.AAPL NASDAQ.NMS/TOP/ALL, Advanced Order Reject JSON:
2024-12-13 02:01:32,434 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,434 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:32,434 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,634 - DEBUG - queue.get: empty
2024-12-13 02:01:32,634 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,634 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:32,634 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,685 - DEBUG - len 203 raw:b"\x00\x00\x00\xc74\x002\x001002\x0010089\x00Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.MSFT NASDAQ.NMS/TOP/ALL\x00\x00"|
2024-12-13 02:01:32,685 - DEBUG - reader loop, recvd size 203
2024-12-13 02:01:32,685 - DEBUG - read_msg: size: 199
2024-12-13 02:01:32,685 - DEBUG - size:199 msg.size:199 msg:|b''| buf:||
2024-12-13 02:01:32,685 - DEBUG - fields (b'4', b'2', b'1002', b'10089', b"Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.MSFT NASDAQ.NMS/TOP/ALL", b'')
2024-12-13 02:01:32,685 - DEBUG - decode <class 'int'> b'2'
2024-12-13 02:01:32,685 - DEBUG - decode <class 'int'> b'1002'
2024-12-13 02:01:32,685 - DEBUG - decode <class 'int'> b'10089'
2024-12-13 02:01:32,685 - DEBUG - decode <class 'str'> b"Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.MSFT NASDAQ.NMS/TOP/ALL"
2024-12-13 02:01:32,685 - DEBUG - decode <class 'str'> b''
2024-12-13 02:01:32,685 - ERROR - Error. ReqId: 1002, Code: 10089, Msg: Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.MSFT NASDAQ.NMS/TOP/ALL, Advanced Order Reject JSON:
2024-12-13 02:01:32,685 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,685 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:32,685 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,710 - DEBUG - len 204 raw:b"\x00\x00\x00\xc84\x002\x001003\x0010089\x00Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.GOOGL NASDAQ.NMS/TOP/ALL\x00\x00"|
2024-12-13 02:01:32,710 - DEBUG - reader loop, recvd size 204
2024-12-13 02:01:32,710 - DEBUG - read_msg: size: 200
2024-12-13 02:01:32,710 - DEBUG - size:200 msg.size:200 msg:|b''| buf:||
2024-12-13 02:01:32,710 - DEBUG - fields (b'4', b'2', b'1003', b'10089', b"Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.GOOGL NASDAQ.NMS/TOP/ALL", b'')
2024-12-13 02:01:32,710 - DEBUG - decode <class 'int'> b'2'
2024-12-13 02:01:32,710 - DEBUG - decode <class 'int'> b'1003'
2024-12-13 02:01:32,710 - DEBUG - decode <class 'int'> b'10089'
2024-12-13 02:01:32,710 - DEBUG - decode <class 'str'> b"Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.GOOGL NASDAQ.NMS/TOP/ALL"
2024-12-13 02:01:32,710 - DEBUG - decode <class 'str'> b''
2024-12-13 02:01:32,710 - ERROR - Error. ReqId: 1003, Code: 10089, Msg: Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.GOOGL NASDAQ.NMS/TOP/ALL, Advanced Order Reject JSON:
2024-12-13 02:01:32,710 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,710 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:32,710 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,711 - DEBUG - len 203 raw:b"\x00\x00\x00\xc74\x002\x001004\x0010089\x00Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.AMZN NASDAQ.NMS/TOP/ALL\x00\x00"|
2024-12-13 02:01:32,711 - DEBUG - reader loop, recvd size 203
2024-12-13 02:01:32,711 - DEBUG - read_msg: size: 199
2024-12-13 02:01:32,711 - DEBUG - size:199 msg.size:199 msg:|b''| buf:||
2024-12-13 02:01:32,711 - DEBUG - fields (b'4', b'2', b'1004', b'10089', b"Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.AMZN NASDAQ.NMS/TOP/ALL", b'')
2024-12-13 02:01:32,712 - DEBUG - decode <class 'int'> b'2'
2024-12-13 02:01:32,712 - DEBUG - decode <class 'int'> b'1004'
2024-12-13 02:01:32,712 - DEBUG - decode <class 'int'> b'10089'
2024-12-13 02:01:32,712 - DEBUG - decode <class 'str'> b"Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.AMZN NASDAQ.NMS/TOP/ALL"
2024-12-13 02:01:32,712 - DEBUG - decode <class 'str'> b''
2024-12-13 02:01:32,712 - ERROR - Error. ReqId: 1004, Code: 10089, Msg: Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.AMZN NASDAQ.NMS/TOP/ALL, Advanced Order Reject JSON:
2024-12-13 02:01:32,712 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,712 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:32,712 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,912 - DEBUG - queue.get: empty
2024-12-13 02:01:32,912 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,912 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:32,912 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,934 - DEBUG - len 203 raw:b"\x00\x00\x00\xc74\x002\x001005\x0010089\x00Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.TSLA NASDAQ.NMS/TOP/ALL\x00\x00"|
2024-12-13 02:01:32,934 - DEBUG - reader loop, recvd size 203
2024-12-13 02:01:32,934 - DEBUG - read_msg: size: 199
2024-12-13 02:01:32,934 - DEBUG - size:199 msg.size:199 msg:|b''| buf:||
2024-12-13 02:01:32,934 - DEBUG - fields (b'4', b'2', b'1005', b'10089', b"Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.TSLA NASDAQ.NMS/TOP/ALL", b'')
2024-12-13 02:01:32,934 - DEBUG - decode <class 'int'> b'2'
2024-12-13 02:01:32,934 - DEBUG - decode <class 'int'> b'1005'
2024-12-13 02:01:32,934 - DEBUG - decode <class 'int'> b'10089'
2024-12-13 02:01:32,934 - DEBUG - decode <class 'str'> b"Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.TSLA NASDAQ.NMS/TOP/ALL"
2024-12-13 02:01:32,934 - DEBUG - decode <class 'str'> b''
2024-12-13 02:01:32,934 - ERROR - Error. ReqId: 1005, Code: 10089, Msg: Requested market data requires additional subscription for API. See link in 'Market Data Connections' dialog for more details.Delayed market data is available.TSLA NASDAQ.NMS/TOP/ALL, Advanced Order Reject JSON:
2024-12-13 02:01:32,935 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:32,935 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:32,935 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:33,135 - DEBUG - queue.get: empty
2024-12-13 02:01:33,135 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:33,135 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:33,135 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:33,335 - DEBUG - queue.get: empty
2024-12-13 02:01:33,335 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:33,335 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:33,335 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:33,536 - DEBUG - queue.get: empty
2024-12-13 02:01:33,536 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:33,536 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:33,536 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:33,736 - DEBUG - queue.get: empty
2024-12-13 02:01:33,736 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:33,736 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:33,736 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:33,935 - DEBUG - socket timeout from recvMsg (<class 'TimeoutError'>, TimeoutError('timed out'), <traceback object at 0x7f38c34ba040>)
2024-12-13 02:01:33,935 - DEBUG - reader loop, recvd size 0
2024-12-13 02:01:33,936 - DEBUG - queue.get: empty
2024-12-13 02:01:33,936 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:33,936 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:33,936 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:34,137 - DEBUG - queue.get: empty
2024-12-13 02:01:34,137 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:34,137 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:34,137 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:34,337 - DEBUG - queue.get: empty
2024-12-13 02:01:34,337 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:34,337 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:34,337 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:34,537 - DEBUG - queue.get: empty
2024-12-13 02:01:34,538 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:34,538 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:34,538 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:34,738 - DEBUG - queue.get: empty
2024-12-13 02:01:34,738 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:34,738 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:34,738 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:34,937 - DEBUG - socket timeout from recvMsg (<class 'TimeoutError'>, TimeoutError('timed out'), <traceback object at 0x7f38c34b9ec0>)
2024-12-13 02:01:34,937 - DEBUG - reader loop, recvd size 0
2024-12-13 02:01:34,938 - DEBUG - queue.get: empty
2024-12-13 02:01:34,938 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:34,938 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:34,938 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:35,139 - DEBUG - queue.get: empty
2024-12-13 02:01:35,139 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:35,139 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:35,139 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:35,339 - DEBUG - queue.get: empty
2024-12-13 02:01:35,339 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:35,339 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:35,339 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:35,539 - DEBUG - queue.get: empty
2024-12-13 02:01:35,539 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:35,539 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:35,539 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:35,740 - DEBUG - queue.get: empty
2024-12-13 02:01:35,740 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:35,740 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:35,740 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:35,938 - DEBUG - socket timeout from recvMsg (<class 'TimeoutError'>, TimeoutError('timed out'), <traceback object at 0x7f38c34ba1c0>)
2024-12-13 02:01:35,938 - DEBUG - reader loop, recvd size 0
2024-12-13 02:01:35,940 - DEBUG - queue.get: empty
2024-12-13 02:01:35,940 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:35,940 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:35,940 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:36,141 - DEBUG - queue.get: empty
2024-12-13 02:01:36,141 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:36,141 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:36,141 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:36,341 - DEBUG - queue.get: empty
2024-12-13 02:01:36,341 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:36,341 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:36,341 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:36,541 - DEBUG - queue.get: empty
2024-12-13 02:01:36,541 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:36,541 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:36,541 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:36,742 - DEBUG - queue.get: empty
2024-12-13 02:01:36,742 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:36,742 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:36,742 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:36,939 - DEBUG - socket timeout from recvMsg (<class 'TimeoutError'>, TimeoutError('timed out'), <traceback object at 0x7f38c34ba040>)
2024-12-13 02:01:36,939 - DEBUG - reader loop, recvd size 0
2024-12-13 02:01:36,942 - DEBUG - queue.get: empty
2024-12-13 02:01:36,942 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:36,942 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:36,942 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:37,143 - DEBUG - queue.get: empty
2024-12-13 02:01:37,143 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:37,143 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:37,143 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:37,343 - DEBUG - queue.get: empty
2024-12-13 02:01:37,343 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:37,343 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:37,343 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:37,543 - DEBUG - queue.get: empty
2024-12-13 02:01:37,543 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:37,543 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:37,543 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:37,744 - DEBUG - queue.get: empty
2024-12-13 02:01:37,744 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:37,744 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:37,744 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:37,941 - DEBUG - socket timeout from recvMsg (<class 'TimeoutError'>, TimeoutError('timed out'), <traceback object at 0x7f38c34b9ec0>)
2024-12-13 02:01:37,941 - DEBUG - reader loop, recvd size 0
2024-12-13 02:01:37,944 - DEBUG - queue.get: empty
2024-12-13 02:01:37,944 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:37,944 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:37,944 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:38,144 - DEBUG - queue.get: empty
2024-12-13 02:01:38,144 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:38,144 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:38,145 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:38,345 - DEBUG - queue.get: empty
2024-12-13 02:01:38,345 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:38,345 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:38,345 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:38,545 - DEBUG - queue.get: empty
2024-12-13 02:01:38,545 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:38,545 - DEBUG - conn:1 queue.sz:0
2024-12-13 02:01:38,545 - DEBUG - 139881771299920 isConn: 2, connConnected: True
2024-12-13 02:01:38,746 - DEBUG - queue.get: empty
|
2024-12-13 02:01:38,746 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:38,746 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:38,746 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:38,942 - DEBUG - socket timeout from recvMsg (<class 'TimeoutError'>, TimeoutError('timed out'), <traceback object at 0x7f38c34ba1c0>)
|
||||||
|
2024-12-13 02:01:38,942 - DEBUG - reader loop, recvd size 0
|
||||||
|
2024-12-13 02:01:38,946 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:38,946 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:38,946 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:38,946 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:39,146 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:39,146 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:39,146 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:39,146 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:39,346 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:39,347 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:39,347 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:39,347 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:39,547 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:39,547 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:39,547 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:39,547 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:39,747 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:39,747 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:39,747 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:39,747 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:39,942 - DEBUG - socket timeout from recvMsg (<class 'TimeoutError'>, TimeoutError('timed out'), <traceback object at 0x7f38c34ba040>)
|
||||||
|
2024-12-13 02:01:39,942 - DEBUG - reader loop, recvd size 0
|
||||||
|
2024-12-13 02:01:39,948 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:39,948 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:39,948 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:39,948 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:40,148 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:40,148 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:40,148 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:40,148 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:40,348 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:40,348 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:40,348 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:40,348 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:40,549 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:40,549 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:40,549 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:40,549 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:40,749 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:40,749 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:40,749 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:40,749 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:40,943 - DEBUG - socket timeout from recvMsg (<class 'TimeoutError'>, TimeoutError('timed out'), <traceback object at 0x7f38c34b9ec0>)
|
||||||
|
2024-12-13 02:01:40,943 - DEBUG - reader loop, recvd size 0
|
||||||
|
2024-12-13 02:01:40,949 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:40,949 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:40,949 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:40,949 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:41,150 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:41,150 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:41,150 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:41,150 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:41,350 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:41,350 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:41,350 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:41,350 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:41,550 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:41,550 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:41,550 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:41,550 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:41,751 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:41,751 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:41,751 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:41,751 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:41,945 - DEBUG - socket timeout from recvMsg (<class 'TimeoutError'>, TimeoutError('timed out'), <traceback object at 0x7f38c34ba1c0>)
|
||||||
|
2024-12-13 02:01:41,945 - DEBUG - reader loop, recvd size 0
|
||||||
|
2024-12-13 02:01:41,951 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:41,951 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:41,951 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:41,951 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:42,152 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:42,152 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:42,152 - DEBUG - conn:1 queue.sz:0
|
||||||
|
2024-12-13 02:01:42,152 - DEBUG - 139881771299920 isConn: 2, connConnected: True
|
||||||
|
2024-12-13 02:01:42,322 - WARNING - Timeout while waiting for market data.
|
||||||
|
2024-12-13 02:01:42,322 - DEBUG - 139881771299920 connState: 2 -> 0
|
||||||
|
2024-12-13 02:01:42,322 - INFO - disconnecting
|
||||||
|
2024-12-13 02:01:42,322 - DEBUG - disconnecting
|
||||||
|
2024-12-13 02:01:42,322 - DEBUG - disconnected
|
||||||
|
2024-12-13 02:01:42,322 - INFO - Connection to IB Gateway closed.
|
||||||
|
2024-12-13 02:01:42,323 - DEBUG - 139881771299920 connState: None -> 0
|
||||||
|
2024-12-13 02:01:42,352 - DEBUG - queue.get: empty
|
||||||
|
2024-12-13 02:01:42,352 - DEBUG - 139881771299920 isConn: 0, connConnected: None
|
||||||
|
2024-12-13 02:01:42,352 - DEBUG - conn:0 queue.sz:0
|
||||||
|
2024-12-13 02:01:42,352 - DEBUG - 139881771299920 isConn: 0, connConnected: None
|
||||||
|
2024-12-13 02:01:42,352 - DEBUG - 139881771299920 connState: 0 -> 0
|
||||||
|
2024-12-13 02:01:42,946 - DEBUG - socket broken, disconnecting
|
||||||
|
2024-12-13 02:01:42,946 - DEBUG - reader loop, recvd size 0
|
||||||
|
2024-12-13 02:01:42,946 - DEBUG - EReader thread finished
|
||||||
|
2024-12-13 02:01:44,323 - INFO - TestStockRetriever completed.
|
||||||
238
src/MidasV1/tests/test_stock_retriever.py
Normal file
@@ -0,0 +1,238 @@
# test_stock_retriever.py

"""
========================================================================
# README
#
# Program: test_stock_retriever.py
#
# Description:
#   This script serves as a testing tool for the MidasV1 Trading Bot.
#   It connects to the Interactive Brokers (IB) Gateway, retrieves real-time
#   market data for a predefined list of stock symbols, and applies internal
#   criteria to filter the stocks based on share price and trading volume.
#
# Features:
#   - Connects to IB Gateway on a specified host and port.
#   - Requests market data (last price and volume) for a list of stock symbols.
#   - Applies filtering criteria to identify stocks that meet minimum share
#     price and trading volume requirements.
#   - Provides colored console outputs for better readability and user feedback.
#   - Handles graceful shutdown on interrupt signals and error conditions.
#   - Implements robust error handling to manage API connection issues and
#     data retrieval problems.
#
# Usage:
#   Run the script from the command line:
#       python test_stock_retriever.py
#
#   The script will attempt to connect to IB Gateway, request market data, and
#   display the list of stocks that meet the specified criteria. If no stocks
#   meet the criteria, it will notify the user accordingly.
#
# Enhancements:
#   - Added colored outputs using ANSI escape codes (green for pass, red for fail).
#   - Implemented signal handling for graceful exits on interrupts.
#   - Enhanced error handling for various API and connection errors.
#   - Improved logging for detailed traceability.
#
# Coded by: kleinpanic 2024
========================================================================
"""

import logging
import threading
import time
import signal
import sys

from ibapi.client import EClient
from ibapi.wrapper import EWrapper
from ibapi.contract import Contract
from ibapi.ticktype import TickTypeEnum

# ANSI color codes for enhanced console outputs
GREEN = "\033[92m"
RED = "\033[91m"
YELLOW = "\033[93m"
BLUE = "\033[94m"
MAGENTA = "\033[95m"
RESET = "\033[0m"

# Configure logging
logging.basicConfig(
    filename='test_stock_retriever.log',
    filemode='w',
    level=logging.DEBUG,
    format='%(asctime)s - %(levelname)s - %(message)s'
)


class TestStockRetriever(EWrapper, EClient):
    def __init__(self, symbols, criteria):
        EClient.__init__(self, self)
        self.symbols = symbols
        self.criteria = criteria
        self.stock_data = {}
        self.data_event = threading.Event()
        self.logger = logging.getLogger('MidasV1.TestStockRetriever')

    def error(self, reqId, errorCode, errorString, advancedOrderRejectJson=""):
        logging.error(f"Error. ReqId: {reqId}, Code: {errorCode}, Msg: {errorString}, Advanced Order Reject JSON: {advancedOrderRejectJson}")
        print(f"{RED}Error. ReqId: {reqId}, Code: {errorCode}, Msg: {errorString}, Advanced Order Reject JSON: {advancedOrderRejectJson}{RESET}")

    def tickPrice(self, reqId, tickType, price, attrib):
        # We are interested in the LAST price
        if tickType == TickTypeEnum.LAST:
            symbol = self.symbols.get(reqId)
            if symbol:
                self.stock_data.setdefault(symbol, {})['last_price'] = price
                logging.debug(f"TickPrice. ReqId: {reqId}, Symbol: {symbol}, Last Price: {price}")
                # Signal once both last_price and volume have been received
                if 'volume' in self.stock_data[symbol]:
                    self.data_event.set()
                print(f"{GREEN}Received last price for {symbol}: ${price}{RESET}")

    def tickSize(self, reqId, tickType, size):
        # We are interested in VOLUME
        if tickType == TickTypeEnum.VOLUME:
            symbol = self.symbols.get(reqId)
            if symbol:
                self.stock_data.setdefault(symbol, {})['volume'] = size
                logging.debug(f"TickSize. ReqId: {reqId}, Symbol: {symbol}, Volume: {size}")
                # Signal once both last_price and volume have been received
                if 'last_price' in self.stock_data[symbol]:
                    self.data_event.set()
                print(f"{GREEN}Received volume for {symbol}: {size}{RESET}")

    def tickString(self, reqId, tickType, value):
        # Optionally handle other tick types
        pass

    def tickGeneric(self, reqId, tickType, value):
        # Optionally handle other tick types
        pass

    def nextValidId(self, orderId):
        # Start requesting market data once the next valid order ID is received
        logging.info(f"NextValidId received: {orderId}")
        self.request_market_data()

    def request_market_data(self):
        logging.info("Requesting market data for symbols...")
        print(f"{BLUE}Requesting market data for symbols...{RESET}")
        for reqId, symbol in self.symbols.items():
            contract = self.create_contract(symbol)
            self.reqMktData(reqId, contract, "", False, False, [])
            logging.debug(f"Requested market data for {symbol} with ReqId: {reqId}")
            print(f"{BLUE}Requested market data for {symbol} with ReqId: {reqId}{RESET}")

    def create_contract(self, symbol):
        contract = Contract()
        contract.symbol = symbol
        contract.secType = "STK"
        contract.exchange = "SMART"
        contract.currency = "USD"
        return contract

    def marketDataType(self, reqId, marketDataType):
        # Optionally handle different market data types
        logging.debug(f"MarketDataType. ReqId: {reqId}, Type: {marketDataType}")

    def connectionClosed(self):
        logging.info("Connection to IB Gateway closed.")
        print(f"{YELLOW}Connection to IB Gateway closed.{RESET}")

    def run_retriever(self):
        # Start the socket reader in a background thread
        api_thread = threading.Thread(target=self.run, daemon=True)
        api_thread.start()

        # Wait for data to be collected or time out after 10 seconds
        if self.data_event.wait(timeout=10):
            logging.info("Market data retrieved successfully.")
            print(f"{GREEN}Market data retrieved successfully.{RESET}")
        else:
            logging.warning("Timeout while waiting for market data.")
            print(f"{RED}Timeout while waiting for market data.{RESET}")

        # Disconnect after data retrieval
        self.disconnect()

    def stop(self):
        self.disconnect()


def signal_handler(sig, frame, app):
    """
    Handles incoming signals for graceful shutdown.

    Args:
        sig (int): Signal number.
        frame: Current stack frame.
        app (TestStockRetriever): The running application instance.
    """
    logger = logging.getLogger('MidasV1.TestStockRetriever')
    logger.error("Interrupt received. Shutting down gracefully...")
    print(f"{RED}Interrupt received. Shutting down gracefully...{RESET}")
    app.disconnect()
    sys.exit(0)


def main():
    # Define a list of stock symbols to retrieve data for
    symbols_to_test = ['AAPL', 'MSFT', 'GOOGL', 'AMZN', 'TSLA']

    # Assign a unique ReqId to each symbol
    symbols = {1001 + idx: symbol for idx, symbol in enumerate(symbols_to_test)}

    # Define internal criteria for filtering stocks
    criteria = {
        'min_share_price': 50.0,  # Minimum share price in USD
        'min_volume': 1000000     # Minimum trading volume
    }

    # Instantiate the TestStockRetriever
    app = TestStockRetriever(symbols, criteria)

    # Register the signal handlers for graceful shutdown
    signal.signal(signal.SIGINT, lambda sig, frame: signal_handler(sig, frame, app))
    signal.signal(signal.SIGTERM, lambda sig, frame: signal_handler(sig, frame, app))

    # Connect to IB Gateway or TWS
    # Update the port and clientId based on your setup
    app.connect("127.0.0.1", 4002, clientId=1)  # Use 4002 for IB Gateway simulated trading

    # Start the data retrieval process
    app.run_retriever()

    # Allow some time for data to be processed
    time.sleep(2)

    # Apply criteria to filter stocks
    filtered_stocks = []
    for symbol, data in app.stock_data.items():
        last_price = data.get('last_price', 0)
        volume = data.get('volume', 0)
        if last_price >= criteria['min_share_price'] and volume >= criteria['min_volume']:
            filtered_stocks.append({
                'symbol': symbol,
                'last_price': last_price,
                'volume': volume
            })
            logging.info(f"Stock {symbol} meets criteria: Price=${last_price}, Volume={volume}")
        else:
            logging.info(f"Stock {symbol} does not meet criteria: Price=${last_price}, Volume={volume}")

    # Display the filtered stocks
    if filtered_stocks:
        print(f"{GREEN}\nStocks meeting the criteria:{RESET}")
        print("-----------------------------")
        for stock in filtered_stocks:
            print(f"Symbol: {stock['symbol']}, Last Price: ${stock['last_price']:.2f}, Volume: {stock['volume']}")
        print("-----------------------------\n")
    else:
        print(f"{RED}\nNo stocks met the specified criteria.\n{RESET}")

    # Log the end of the test
    logging.info("TestStockRetriever completed.")


if __name__ == "__main__":
    main()
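The price/volume filter applied in `main()` above can be sketched as a standalone helper, independent of the IB connection. This is a minimal illustration; `meets_criteria` is not part of the module, and the quote values are made up:

```python
def meets_criteria(last_price, volume, criteria):
    """Return True when a stock passes both the price and volume thresholds."""
    return (last_price >= criteria['min_share_price']
            and volume >= criteria['min_volume'])


criteria = {'min_share_price': 50.0, 'min_volume': 1_000_000}

# A high-priced, heavily traded quote passes; a $10 stock fails on price
print(meets_criteria(230.5, 45_000_000, criteria))  # True
print(meets_criteria(10.0, 90_000_000, criteria))   # False
```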
0
src/WebScraper/README.md
Normal file
16
src/WebScraper/assets/oil_key_words.txt
Normal file
@@ -0,0 +1,16 @@
oil 5
profit 4
price 3
gas 4
energy 5
production 3
demand 2
supply 2
barrel 3
economy 4
investment 3
revenue 4
loss 2
rise 5
decline 1
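Each line of `oil_key_words.txt` pairs a keyword with an integer weight. A minimal sketch of parsing that format and scoring a headline with it, under the assumption that a headline's score is the sum of the weights of its known keywords (the helpers here are illustrative, not part of the repo):

```python
import re


def parse_keyword_weights(lines):
    """Parse 'keyword weight' pairs into a dict, ignoring malformed lines."""
    weights = {}
    for line in lines:
        parts = line.strip().split()
        if len(parts) == 2 and parts[1].isdigit():
            weights[parts[0].lower()] = int(parts[1])
    return weights


def score_text(text, weights):
    """Sum the weights of every known keyword occurrence in the text."""
    words = re.findall(r'\b\w+\b', text.lower())
    return sum(weights.get(word, 0) for word in words)


weights = parse_keyword_weights(["oil 5", "price 3", "supply 2"])
print(score_text("Oil price rises as supply tightens", weights))  # 5 + 3 + 2 = 10
```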
2471
src/WebScraper/data/oil_news.json
Normal file
File diff suppressed because it is too large
Load Diff
4003
src/WebScraper/data/preprocessed_oil_news.json
Normal file
File diff suppressed because it is too large
Load Diff
47
src/WebScraper/main.py
Normal file
@@ -0,0 +1,47 @@
import argparse
import sys
import time
import scrapers.oil_news_scraper as oil_news
import scrapers.oil_news_preprocessor as oil_news_preprocessor
from tqdm import tqdm


def show_usage_bar(duration):
    for _ in tqdm(range(duration), desc="Processing", unit="sec"):
        time.sleep(1)


def run_scraper():
    print("Starting oil data collection with the scraper...")
    show_usage_bar(0)  # Simulated progress bar duration
    oil_news.run_scraper()
    print("Oil news data scraping completed.")


def run_preprocessor():
    print("Starting oil data collection with the preprocessor...")
    show_usage_bar(0)  # Simulated progress bar duration
    oil_news_preprocessor.run_preprocessor()
    print("Oil news data preprocessing completed.")


def main():
    parser = argparse.ArgumentParser(
        description="Oil News Data Collection Tool"
    )
    parser.add_argument(
        "--scraper", action="store_true", help="Run the oil news scraper (original code)."
    )
    parser.add_argument(
        "--preprocessed", action="store_true", help="Run the oil news preprocessor (new code for sentiment analysis)."
    )

    args = parser.parse_args()

    if args.scraper:
        run_scraper()
    elif args.preprocessed:
        run_preprocessor()
    else:
        print("No valid option selected. Use '--scraper' to run the scraper or '--preprocessed' to run the preprocessor.")
        parser.print_help()


if __name__ == "__main__":
    main()
30
src/WebScraper/requirements.txt
Normal file
@@ -0,0 +1,30 @@
attrs==24.2.0
beautifulsoup4==4.12.3
certifi==2024.8.30
charset-normalizer==3.4.0
flake8==7.1.1
h11==0.14.0
idna==3.10
mccabe==0.7.0
numpy==2.1.2
outcome==1.3.0.post0
pandas==2.2.3
pycodestyle==2.12.1
pyflakes==3.2.0
PySocks==1.7.1
python-dateutil==2.9.0.post0
pytz==2024.2
requests==2.32.3
selenium==4.25.0
six==1.16.0
sniffio==1.3.1
sortedcontainers==2.4.0
soupsieve==2.6
tqdm==4.66.6
trio==0.27.0
trio-websocket==0.11.1
typing_extensions==4.12.2
tzdata==2024.2
urllib3==2.2.3
websocket-client==1.8.0
wsproto==1.2.0
60
src/WebScraper/retrievers/ibkr_data_retriever.py
Normal file
@@ -0,0 +1,60 @@
import json
from datetime import datetime
from ibapi.client import EClient
from ibapi.wrapper import EWrapper
from ibapi.contract import Contract


class IBKRDataRetriever(EWrapper, EClient):
    def __init__(self):
        EClient.__init__(self, self)

    def connect_and_retrieve_data(self):
        self.connect("127.0.0.1", 7497, clientId=0)  # Ensure IB Gateway or TWS is running
        contract = Contract()
        contract.symbol = "AAPL"  # Example stock; replace as needed
        contract.secType = "STK"
        contract.exchange = "SMART"
        contract.currency = "USD"

        self.reqHistoricalData(
            reqId=1,
            contract=contract,
            endDateTime=datetime.now().strftime("%Y%m%d %H:%M:%S"),
            durationStr="1 D",
            barSizeSetting="1 day",
            whatToShow="MIDPOINT",
            useRTH=1,
            formatDate=1,
            keepUpToDate=False,
            chartOptions=[]
        )
        # Run the message loop so the historicalData callbacks are delivered
        self.run()

    def historicalData(self, reqId, bar):
        data = {
            "Date": bar.date,
            "Close/Last": bar.close,
            "Volume": bar.volume,
            "Open": bar.open,
            "High": bar.high,
            "Low": bar.low
        }
        self.save_data_to_json(data)

    def historicalDataEnd(self, reqId, start, end):
        # All bars received; close the connection so run() returns
        self.disconnect()

    def save_data_to_json(self, data):
        json_path = "../data/HistoricalData.json"
        try:
            with open(json_path, "r") as file:
                historical_data = json.load(file)
        except FileNotFoundError:
            historical_data = []

        historical_data.insert(0, data)

        with open(json_path, "w") as file:
            json.dump(historical_data, file, indent=4)
        print(f"Data saved to {json_path}")


if __name__ == "__main__":
    app = IBKRDataRetriever()
    app.connect_and_retrieve_data()
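`save_data_to_json` above prepends each new bar to a JSON array on disk, creating the file if it does not exist. That load-modify-write pattern can be exercised in isolation; this sketch uses a temporary directory and made-up records rather than the repo's `HistoricalData.json`:

```python
import json
import os
import tempfile


def prepend_record(json_path, record):
    """Insert a record at the front of a JSON list file, creating it if absent."""
    try:
        with open(json_path, "r") as f:
            records = json.load(f)
    except FileNotFoundError:
        records = []
    records.insert(0, record)
    with open(json_path, "w") as f:
        json.dump(records, f, indent=4)


path = os.path.join(tempfile.mkdtemp(), "HistoricalData.json")
prepend_record(path, {"Date": "20241212", "Close/Last": 230.1})
prepend_record(path, {"Date": "20241213", "Close/Last": 231.4})

with open(path) as f:
    data = json.load(f)
print(data[0]["Date"])  # newest record first: 20241213
```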
Binary file not shown.
251
src/WebScraper/scrapers/oil_news_preprocessor.py
Normal file
@@ -0,0 +1,251 @@
import json
import re
import os
import time
from selenium import webdriver
from selenium.webdriver.firefox.options import Options
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from bs4 import BeautifulSoup
from tqdm import tqdm  # Progress bar

OIL_NEWS_URL = "https://oilprice.com/Latest-Energy-News/World-News/"
SCRAPER_DIR = os.path.dirname(os.path.dirname(__file__))  # One level up
DATA_DIR = os.path.join(SCRAPER_DIR, "data")
KEYWORD_FILE_PATH = os.path.join(SCRAPER_DIR, "assets", "oil_key_words.txt")

if not os.path.exists(DATA_DIR):
    os.makedirs(DATA_DIR)


def load_existing_data(file_path):
    if os.path.exists(file_path):
        with open(file_path, 'r', encoding='utf-8') as f:
            return json.load(f)
    return []


def save_to_json(data, file_path):
    existing_data = load_existing_data(file_path)
    existing_links = {article['link'] for article in existing_data if 'link' in article}

    new_data = []
    for article in data:
        if 'link' not in article or article['link'] in existing_links:
            print(f"Skipping duplicate or missing link article: {article.get('headline', 'Unknown Headline')}")
            continue
        new_data.append(article)

    combined_data = existing_data + new_data

    with open(file_path, 'w', encoding='utf-8') as f:
        json.dump(combined_data, f, ensure_ascii=False, indent=4)
    print(f"Data saved to {file_path}")


def load_keyword_importance(file_path):
    keyword_importance = {}
    if os.path.exists(file_path):
        with open(file_path, 'r', encoding='utf-8') as f:
            for line in f:
                parts = line.strip().split()
                if len(parts) == 2:
                    keyword, importance = parts
                    keyword_importance[keyword.lower()] = int(importance)
    else:
        print(f"Keyword file not found at {file_path}")
    return keyword_importance


keyword_importance = load_keyword_importance(KEYWORD_FILE_PATH)


def extract_keywords(text, keyword_importance):
    words = re.findall(r'\b\w+\b', text.lower())
    keywords = {word: keyword_importance[word] for word in words if word in keyword_importance}
    return sorted(keywords.items(), key=lambda x: x[1], reverse=True)[:10]


def filter_content(content):
    """Remove advertisements, irrelevant phrases, headers, and disclaimers from content."""
    patterns = [
        r'ADVERTISEMENT',
        r'Click Here for \d+\+ Global Oil Prices',
        r'Find us on:',
        r'Back to homepage',
        r'Join the discussion',
        r'More Top Reads From Oilprice.com',
        r'©OilPrice\.com.*?educational purposes',
        r'A Media Solutions.*?Oilprice.com',
        r'\"It\'s most important 8 minute read of my week…\"',
        r'^[\w\s]*?is a [\w\s]*? for Oilprice\.com.*?More Info',
        r'^.*?DNOW is a supplier.*?,',
    ]

    for pattern in patterns:
        content = re.sub(pattern, '', content, flags=re.IGNORECASE)
    content = re.sub(r'\s+', ' ', content).strip()
    return content


def scrape_author_info(driver, author_url, headline_pages=1):
    """Scrape author's name, bio, contributor since date, and latest article headlines with excerpts, keywords, and timestamp."""
    author_name = "Unknown"
    author_bio = ""
    contributor_since = ""
    other_articles = []

    try:
        # Load author page
        driver.get(author_url)
        WebDriverWait(driver, 15).until(
            EC.presence_of_element_located((By.TAG_NAME, "h1"))
        )
        page_source = driver.page_source
        bio_soup = BeautifulSoup(page_source, "html.parser")

        # Extract author name
        author_name_tag = bio_soup.find('h1')
        author_name = author_name_tag.get_text(strip=True) if author_name_tag else "Unknown Author"

        # Extract author bio
        author_bio_tag = bio_soup.find('div', class_='biography')
        author_bio = author_bio_tag.get_text(strip=True) if author_bio_tag else "No bio available"

        # Extract contributor since date
        contributor_since_tag = bio_soup.find('p', class_='contributor_since')
        contributor_since = contributor_since_tag.get_text(strip=True).replace("Contributor since: ", "") if contributor_since_tag else "Unknown Date"

        # Extract latest articles by author with heading, excerpt, keywords, and timestamp
        for page in range(1, headline_pages + 1):
            driver.get(f"{author_url}/Page-{page}.html")
            WebDriverWait(driver, 10).until(
                EC.presence_of_element_located((By.CLASS_NAME, "articles"))
            )
            page_soup = BeautifulSoup(driver.page_source, "html.parser")
            article_tags = page_soup.find_all('li', class_='clear')

            for article in article_tags:
                heading_tag = article.find('h3')
                excerpt_tag = article.find('p', class_='articlecontent')
                timestamp_tag = article.find('div', class_='meta')

                if heading_tag and excerpt_tag and timestamp_tag:
                    heading = heading_tag.get_text(strip=True)
                    excerpt = filter_content(excerpt_tag.get_text(strip=True))  # Use filter_content
                    timestamp = timestamp_tag.get_text(strip=True).split("|")[0].replace("Published ", "").strip()
                    keywords = [keyword for keyword, _ in extract_keywords(excerpt, keyword_importance)]

                    other_articles.append({
                        "heading": heading,
                        "excerpt": excerpt,
                        "keywords": keywords,
                        "published_date": timestamp
                    })

    except Exception as e:
        print(f"Error scraping author info: {e}")
        author_name = "Error Occurred"
        author_bio = str(e)
        contributor_since = "N/A"
        other_articles = [{"heading": "Error retrieving articles", "excerpt": "", "keywords": [], "published_date": ""}]

    return {
        "name": author_name,
        "bio": author_bio,
        "contributor_since": contributor_since,
        "other_articles": other_articles
    }


def scrape_oil_news():
    print("Scraping oil news articles for sentiment analysis...")

    options = Options()
    options.headless = True
    driver = webdriver.Firefox(options=options)

    news_data = []
    page_number = 1
    max_pages = 1
    total_articles = 0

    while page_number <= max_pages:
        driver.get(f"{OIL_NEWS_URL}Page-{page_number}.html")
        try:
            WebDriverWait(driver, 10).until(
                EC.presence_of_element_located((By.CLASS_NAME, "categoryArticle"))
|
||||||
|
)
|
||||||
|
except:
|
||||||
|
break
|
||||||
|
soup = BeautifulSoup(driver.page_source, "html.parser")
|
||||||
|
total_articles += len(soup.find_all('div', class_='categoryArticle'))
|
||||||
|
page_number += 1
|
||||||
|
|
||||||
|
page_number = 1
|
||||||
|
with tqdm(total=total_articles, desc="Scraping articles", unit="article") as pbar:
|
||||||
|
while page_number <= max_pages:
|
||||||
|
print(f"\nProcessing page {page_number}...")
|
||||||
|
driver.get(f"{OIL_NEWS_URL}Page-{page_number}.html")
|
||||||
|
soup = BeautifulSoup(driver.page_source, "html.parser")
|
||||||
|
articles = soup.find_all('div', class_='categoryArticle')
|
||||||
|
if not articles:
|
||||||
|
break
|
||||||
|
|
||||||
|
for article in articles:
|
||||||
|
headline = article.find('h2', class_='categoryArticle__title').get_text(strip=True) if article.find('h2', class_='categoryArticle__title') else None
|
||||||
|
link_tag = article.find('a', href=True)
|
||||||
|
link = link_tag['href'] if link_tag else None
|
||||||
|
date_meta = article.find('p', class_='categoryArticle__meta')
|
||||||
|
date = date_meta.get_text(strip=True).split('|')[0].strip() if date_meta else None
|
||||||
|
|
||||||
|
content = ""
|
||||||
|
if link:
|
||||||
|
print(f"Fetching article: {link}")
|
||||||
|
driver.get(link)
|
||||||
|
try:
|
||||||
|
WebDriverWait(driver, 10).until(
|
||||||
|
EC.presence_of_element_located((By.CLASS_NAME, "singleArticle"))
|
||||||
|
)
|
||||||
|
article_soup = BeautifulSoup(driver.page_source, "html.parser")
|
||||||
|
raw_content = " ".join([p.get_text(strip=True) for p in article_soup.find_all('p')])
|
||||||
|
content = filter_content(raw_content)
|
||||||
|
|
||||||
|
# Fetch author info using scrape_author_info
|
||||||
|
author_url = article_soup.find('a', text=re.compile(r'More Info|Read More', re.IGNORECASE))['href']
|
||||||
|
author_info = scrape_author_info(driver, author_url, headline_pages=1)
|
||||||
|
|
||||||
|
except:
|
||||||
|
print(f"Error: Content did not load for article {headline}.")
|
||||||
|
author_info = {
|
||||||
|
"name": "Unknown",
|
||||||
|
"bio": "",
|
||||||
|
"contributor_since": "",
|
||||||
|
"other_articles": []
|
||||||
|
}
|
||||||
|
|
||||||
|
extracted_keywords = extract_keywords(f"{headline} {content}", keyword_importance)
|
||||||
|
|
||||||
|
if headline and link and date:
|
||||||
|
news_data.append({
|
||||||
|
'headline': headline,
|
||||||
|
'link': link,
|
||||||
|
'content': content,
|
||||||
|
'date': date,
|
||||||
|
'author': author_info['name'],
|
||||||
|
'author_bio': author_info['bio'],
|
||||||
|
'contributor_since': author_info['contributor_since'],
|
||||||
|
'other_articles': author_info['other_articles'],
|
||||||
|
'keywords': extracted_keywords,
|
||||||
|
})
|
||||||
|
|
||||||
|
pbar.set_postfix_str(f"Processing article: {headline[:40]}...")
|
||||||
|
pbar.update(1)
|
||||||
|
|
||||||
|
page_number += 1
|
||||||
|
time.sleep(2)
|
||||||
|
|
||||||
|
driver.quit()
|
||||||
|
return news_data
|
||||||
|
|
||||||
|
def run_preprocessor():
|
||||||
|
file_path = os.path.join(DATA_DIR, 'preprocessed_oil_news.json')
|
||||||
|
news_data = scrape_oil_news()
|
||||||
|
save_to_json(news_data, file_path)
|
||||||
|
|
||||||
|
if __name__ == "__main__":
|
||||||
|
run_preprocessor()
|
||||||
|
|
||||||
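The content filter used by the preprocessor above removes known promotional sentences by regex and then collapses runs of whitespace. A minimal standalone sketch of that technique (the pattern list here is illustrative, not the module's actual list):

```python
import re

# Illustrative boilerplate patterns, in the spirit of the module's list
patterns = [
    r'^.*?is a supplier.*?\.',  # hypothetical promo sentence at the start of an article
    r'ADVERTISEMENT',
]

def filter_content(content: str) -> str:
    """Strip known boilerplate patterns, then collapse whitespace runs."""
    for pattern in patterns:
        content = re.sub(pattern, '', content, flags=re.IGNORECASE)
    return re.sub(r'\s+', ' ', content).strip()

print(filter_content("Acme is a supplier of pipes. Oil   rose\n today. ADVERTISEMENT"))
# → Oil rose today.
```

Ordering matters here: the sentence-level patterns run before whitespace normalization, so they can anchor on the original line structure.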
143
src/WebScraper/scrapers/oil_news_scraper.py
Normal file
@@ -0,0 +1,143 @@
import json
from selenium import webdriver
from selenium.webdriver.firefox.options import Options
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from bs4 import BeautifulSoup
import os
import time
import re

OIL_NEWS_URL = "https://oilprice.com/Latest-Energy-News/World-News/"
DATA_DIR = os.path.join(os.getcwd(), "data")
KEYWORD_FILE_PATH = os.path.join(os.getcwd(), "assets", "oil_key_words.txt")

if not os.path.exists(DATA_DIR):
    os.makedirs(DATA_DIR)

def load_existing_data(file_path):
    """Load existing data from the JSON file to avoid duplicates."""
    if os.path.exists(file_path):
        with open(file_path, 'r', encoding='utf-8') as f:
            return json.load(f)
    return []

def save_to_json(data, file_path):
    """Save scraped data to a JSON file, ensuring no duplicates."""
    existing_data = load_existing_data(file_path)
    existing_links = {article['link'] for article in existing_data}

    new_data = []
    for article in data:
        if article['link'] in existing_links:
            print(f"Skipping duplicate article: {article['headline']}")
            continue
        new_data.append(article)

    combined_data = existing_data + new_data

    with open(file_path, 'w', encoding='utf-8') as f:
        json.dump(combined_data, f, ensure_ascii=False, indent=4)
    print(f"Oil news data saved to {file_path}")

def load_keyword_importance(file_path):
    """Load keyword importance values from the oil_key_words.txt file."""
    keyword_importance = {}
    if os.path.exists(file_path):
        with open(file_path, 'r', encoding='utf-8') as f:
            for line in f:
                parts = line.strip().split()
                if len(parts) == 2:
                    keyword, importance = parts
                    keyword_importance[keyword.lower()] = int(importance)
    else:
        print(f"Keyword file not found at {file_path}")
    return keyword_importance

keyword_importance = load_keyword_importance(KEYWORD_FILE_PATH)

def extract_keywords(text, keyword_importance):
    """Extract important keywords from text based on an external keyword list."""
    words = re.findall(r'\b\w+\b', text.lower())
    keywords = {}

    for word in words:
        if len(word) > 3 and word in keyword_importance:
            keywords[word] = keyword_importance[word]  # Store the keyword with its importance

    # Return up to 10 unique keywords, highest importance first
    return sorted(keywords.items(), key=lambda x: x[1], reverse=True)[:10]

def analyze_sentiment(text):
    """Basic sentiment analysis placeholder with minimal processing."""
    # Only check for specific keywords; avoid complex logic to save time
    if "profit" in text or "rise" in text:
        return "Positive"
    elif "loss" in text or "decline" in text:
        return "Negative"
    else:
        return "Neutral"

def scrape_oil_news():
    print("Scraping oil market news using Selenium...")

    options = Options()
    options.add_argument("--headless")  # the Options.headless attribute is deprecated in Selenium 4
    driver = webdriver.Firefox(options=options)

    news_data = []
    page_number = 1
    max_pages = 10  # Limit to 10 pages

    while page_number <= max_pages:
        print(f"Processing page {page_number}...")
        driver.get(f"{OIL_NEWS_URL}Page-{page_number}.html")

        try:
            WebDriverWait(driver, 20).until(
                EC.presence_of_element_located((By.CLASS_NAME, "categoryArticle"))
            )
        except Exception:
            print(f"Error: Content did not load properly on page {page_number}.")
            break

        soup = BeautifulSoup(driver.page_source, "html.parser")

        articles = soup.find_all('div', class_='categoryArticle')
        if not articles:
            print(f"No articles found on page {page_number}. Ending pagination.")
            break

        for article in articles:
            headline = article.find('h2', class_='categoryArticle__title').get_text(strip=True) if article.find('h2', class_='categoryArticle__title') else None
            link = article.find('a', href=True)['href'] if article.find('a', href=True) else None
            date = article.find('p', class_='categoryArticle__meta').get_text(strip=True) if article.find('p', class_='categoryArticle__meta') else None
            excerpt = article.find('p', class_='categoryArticle__excerpt').get_text(strip=True) if article.find('p', class_='categoryArticle__excerpt') else None
            author = date.split('|')[-1].strip() if date and '|' in date else "Unknown Author"
            timestamp = date.split('|')[0].strip() if date and '|' in date else date

            if headline and link and date:
                extracted_keywords = extract_keywords(headline + " " + excerpt if excerpt else headline, keyword_importance)
                news_data.append({
                    'headline': headline,
                    'link': link,
                    'date': timestamp,
                    'author': author,
                    'excerpt': excerpt,
                    'keywords': extracted_keywords,
                    'sentiment_analysis': None
                    #'sentiment_analysis': analyze_sentiment(headline + " " + excerpt if excerpt else headline)
                })

        page_number += 1
        time.sleep(2)

    driver.quit()
    return news_data

def run_scraper():
    file_path = os.path.join(DATA_DIR, 'oil_news.json')
    news_data = scrape_oil_news()
    save_to_json(news_data, file_path)
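The `extract_keywords` function in the scraper above scores words against an importance map loaded from `assets/oil_key_words.txt`. Its core selection logic can be exercised standalone with a toy map (the values below are made up, not from the real keyword file):

```python
import re

def extract_keywords(text, keyword_importance, limit=10):
    """Keep words longer than 3 chars that appear in the importance map,
    ranked by importance, returning at most `limit` unique keywords."""
    words = re.findall(r'\b\w+\b', text.lower())
    keywords = {}
    for word in words:
        if len(word) > 3 and word in keyword_importance:
            keywords[word] = keyword_importance[word]
    return sorted(keywords.items(), key=lambda x: x[1], reverse=True)[:limit]

# Toy importance map (illustrative values only)
toy_map = {"crude": 5, "opec": 9, "refinery": 7, "oil": 3}

print(extract_keywords("OPEC cut crude output; the refinery ran on crude.", toy_map))
# → [('opec', 9), ('refinery', 7), ('crude', 5)]
```

Note that "oil" can never match: the `len(word) > 3` filter drops three-letter words even when they are in the map, which is worth keeping in mind when curating the keyword file.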
347
src/WebScraper/scrapers/tests/author_info.json
Normal file
@@ -0,0 +1,347 @@
{
    "name": "Charles Kennedy",
    "bio": "Charles is a writer for Oilprice.com",
    "contributor_since": "29 Sep 2011",
    "other_articles": [
        {
            "heading": "Record Shale Production Helps ConocoPhillips Beat Profit Estimates",
            "excerpt": "ConocoPhillips (NYSE: COP) is raising its ordinary dividend and share buyback program as its third-quarter earnings beat market expectations on the back of higher total…",
            "keywords": ["share", "market", "higher", "back", "total", "expectations", "third", "beat", "raising", "conocophillips"],
            "published_date": "31 October 2024"
        },
        {
            "heading": "Rosneft to Resume Output at Idled Black Sea Refinery in November",
            "excerpt": "Rosneft plans to resume crude processing at its Tuapse oil refinery on Russia’s Black Sea coast in November, after idling it for a month because…",
            "keywords": ["processing", "idling", "russia", "plans", "rosneft", "refinery", "tuapse", "crude", "november", "black"],
            "published_date": "31 October 2024"
        },
        {
            "heading": "Canadian Natural Resources Q3 Profit Slips as Oil and Gas Prices Fall",
            "excerpt": "Canada’s largest oil and gas producer, Canadian Natural Resources (NYSE: CNQ), reported lower adjusted net earnings from operations for the third quarter compared to a…",
            "keywords": ["canada", "operations", "producer", "resources", "reported", "canadian", "largest", "third", "natural", "nyse"],
            "published_date": "31 October 2024"
        },
        {
            "heading": "Exelon Reports 80% Surge in Data Center Power Supply Deals",
            "excerpt": "Exelon has seen an 80% increase in power supply deals coming from data enter operators in the latest sign that the IT industry is driving…",
            "keywords": ["industry", "data", "driving", "seen", "power", "increase", "exelon", "deals", "sign", "that"],
            "published_date": "31 October 2024"
        },
        {
            "heading": "Russia’s Gazprom Boosts 2024 Investments to $16.9 Billion",
            "excerpt": "Gazprom is raising its investment plan for 2024 by 4% to $16.9 billion (1.642 trillion Russian rubles), thanks to rising exports and domestic supply, the…",
            "keywords": ["investment", "russian", "rubles", "plan", "exports", "billion", "raising", "thanks", "trillion", "supply"],
            "published_date": "30 October 2024"
        },
        {
            "heading": "Investment Giants Form $50-Billion AI and Power Partnership",
            "excerpt": "Global investment firm KKR and private-equity giant Energy Capital Partners on Wednesday announced a $50 billion strategic partnership to invest in data centers and power…",
            "keywords": ["centers", "strategic", "investment", "giant", "energy", "capital", "private", "wednesday", "billion", "data"],
            "published_date": "30 October 2024"
        },
        {
            "heading": "Vietnamese EV Maker Gets $1 Billion in Funding Led by UAE",
            "excerpt": "Vietnam’s electric vehicle manufacturer VinFast Auto is expected to receive at least $1 billion in overseas funding led by Emirates Driving Company (EDC), Abu Dhabi’s…",
            "keywords": ["overseas", "manufacturer", "vietnam", "expected", "billion", "driving", "emirates", "funding", "receive", "least"],
            "published_date": "30 October 2024"
        },
        {
            "heading": "Chinese Oil Major to Explore Iraqi Field",
            "excerpt": "China’s CNOOC has inked a deal for exploration at an oil field in central Iraq, the company said today.\nThe deposit, Block 7, will be…",
            "keywords": ["deposit", "cnooc", "iraq", "field", "central", "deal", "today", "said", "china", "inked"],
            "published_date": "30 October 2024"
        },
        {
            "heading": "TotalEnergies to Produce More Gas Condensate Offshore Denmark",
            "excerpt": "U.S. refining and chemicals giant Phillips 66 (NYSE: PSX) booked higher-than-expected earnings for the third quarter even if earnings plunged from a year earlier, as…",
            "keywords": ["phillips", "refining", "giant", "than", "expected", "higher", "year", "plunged", "third", "even"],
            "published_date": "29 October 2024"
        },
        {
            "heading": "Phillips 66 Beats Analyst Estimates Despite Earnings Dip in Q3",
            "excerpt": "U.S. refining and chemicals giant Phillips 66 (NYSE: PSX) booked higher-than-expected earnings for the third quarter even if earnings plunged from a year earlier, as…",
            "keywords": ["phillips", "refining", "giant", "than", "expected", "higher", "year", "plunged", "third", "even"],
            "published_date": "29 October 2024"
        },
        {
            "heading": "UK Offshore Oil Platform Halted Due to Gas Compressor Issue",
            "excerpt": "Production via the Triton Floating Production Storage & Offloading (FPSO) vessel in the UK North Sea has been halted due to a problem with the…",
            "keywords": ["fpso", "been", "with", "problem", "halted", "storage", "triton", "vessel", "offloading", "north"],
            "published_date": "29 October 2024"
        },
        {
            "heading": "UAE’s Renewable Energy Giant Pushes Back Green Hydrogen Targets",
            "excerpt": "Masdar, the clean energy giant of the United Arab Emirates (UAE), has pushed back its target to reach 1 million tons per year of green…",
            "keywords": ["united", "energy", "giant", "emirates", "back", "year", "million", "arab", "pushed", "target"],
            "published_date": "28 October 2024"
        },
        {
            "heading": "Profit at India’s Top Refiner Slumps by 99% Due to Weak Margins",
            "excerpt": "IndianOil, the biggest refiner in India, reported on Monday a net profit tumbling by 98.6% in the quarter to September from a year ago amid…",
            "keywords": ["refiner", "monday", "september", "biggest", "reported", "indianoil", "india", "year", "tumbling", "profit"],
            "published_date": "28 October 2024"
        },
        {
            "heading": "Average U.S. Gasoline Price Set to Drop Below $3 for the First Time Since 2021",
            "excerpt": "The U.S. national average price of gasoline is set to soon fall below $3 per gallon for the first time since 2021, amid lower seasonal…",
            "keywords": ["gasoline", "national", "below", "gallon", "soon", "first", "lower", "average", "seasonal", "price"],
            "published_date": "28 October 2024"
        },
        {
            "heading": "FERC Grants Exxon and Qatar Three-Year Extension to Build Golden Pass LNG",
            "excerpt": "The U.S. Federal Energy Regulatory Commission has granted a three-year extension to ExxonMobil and QatarEnergy to build their $10-billion Golden Pass LNG export plant in…",
            "keywords": ["federal", "export", "three", "energy", "golden", "billion", "year", "their", "qatarenergy", "regulatory"],
            "published_date": "25 October 2024"
        },
        {
            "heading": "Cepsa: Windfall Tax Would Delay Its $3.3-Billion Hydrogen Plan",
            "excerpt": "Cepsa, Spain’s second-largest oil company, will delay its $3.25 billion (3 billion euros) investment into domestic green hydrogen projects if Spain makes the windfall tax…",
            "keywords": ["investment", "second", "projects", "billion", "euros", "largest", "into", "delay", "will", "cepsa"],
            "published_date": "25 October 2024"
        },
        {
            "heading": "South Africa Seeks Loan Guarantees for Energy Transition Funding",
            "excerpt": "South Africa is currently negotiating loan guarantees with its international partners in its $9.3-billion Just Energy Transition Partnership (JETP) program for energy investment.\nThe International…",
            "keywords": ["jetp", "negotiating", "energy", "transition", "currently", "investment", "billion", "south", "africa", "guarantees"],
            "published_date": "25 October 2024"
        },
        {
            "heading": "Saudi Oil Export Revenues Hit Three-Year Low as Prices Decline",
            "excerpt": "Lower crude oil prices dragged Saudi Arabia’s oil export revenues to the lowest level in more than three years in August, amid underwhelming oil demand…",
            "keywords": ["years", "three", "august", "than", "more", "dragged", "revenues", "saudi", "crude", "prices"],
            "published_date": "24 October 2024"
        },
        {
            "heading": "Tesla Stock Soars After Q3 Earnings Beat",
            "excerpt": "Tesla (NASDAQ: TSLA) saw its shares jump by 20% after hours on Wednesday and another 14% in pre-market trade on Thursday after reporting earnings for…",
            "keywords": ["thursday", "after", "trade", "market", "tesla", "wednesday", "another", "nasdaq", "hours", "reporting"],
            "published_date": "24 October 2024"
        },
        {
            "heading": "Oil Refining Giant Valero Tops Estimates Despite Q3 Profit Plunge",
            "excerpt": "One of the biggest U.S. refiners, Valero Energy (NYSE: VLO), beat Wall Street estimates even as it reported a widely expected plunge in its third-quarter…",
            "keywords": ["street", "energy", "biggest", "wall", "reported", "expected", "plunge", "widely", "third", "valero"],
            "published_date": "24 October 2024"
        }
    ]
}
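Fixtures of this shape lend themselves to simple post-processing. For instance, tallying which keywords recur across an author's recent articles (the field names match the JSON above; the sample data here is an abbreviated stand-in, not the full fixture):

```python
from collections import Counter

# Abbreviated stand-in for the fixture's structure above
author_info = {
    "name": "Charles Kennedy",
    "other_articles": [
        {"heading": "A", "keywords": ["billion", "energy", "third"]},
        {"heading": "B", "keywords": ["energy", "refining", "third"]},
        {"heading": "C", "keywords": ["third", "billion"]},
    ],
}

def keyword_frequency(info):
    """Count how often each keyword appears across the author's articles."""
    counts = Counter()
    for article in info["other_articles"]:
        counts.update(article["keywords"])
    return counts

print(keyword_frequency(author_info).most_common(3))
```

A recurring-keyword profile like this is one cheap way to characterize an author's coverage before feeding articles into downstream sentiment analysis.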
109
src/WebScraper/scrapers/tests/author_scraper_test.py
Normal file
@@ -0,0 +1,109 @@
import json
import re
import time
from selenium import webdriver
from selenium.webdriver.firefox.options import Options
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from bs4 import BeautifulSoup

AUTHOR_URL = "https://oilprice.com/contributors/Charles-Kennedy"  # Replace with the actual author URL
OUTPUT_FILE = "author_info.json"

def extract_keywords(text):
    """Basic keyword extraction: unique words longer than 3 characters."""
    words = re.findall(r'\b\w{4,}\b', text.lower())
    keywords = list(set(words))
    return keywords[:10]  # Limit to the top 10 unique keywords for simplicity

def scrape_author_info(author_url, headline_pages=1):
    """Scrape the author's name, bio, contributor-since date, and latest article headlines with excerpts, keywords, and timestamps."""
    options = Options()
    options.add_argument("--headless")  # the Options.headless attribute is deprecated in Selenium 4
    driver = webdriver.Firefox(options=options)

    author_name = "Unknown"
    author_bio = ""
    contributor_since = ""
    other_articles = []

    try:
        # Load the author page
        driver.get(author_url)
        WebDriverWait(driver, 15).until(
            EC.presence_of_element_located((By.TAG_NAME, "h1"))
        )
        bio_soup = BeautifulSoup(driver.page_source, "html.parser")

        # Extract the author name
        author_name_tag = bio_soup.find('h1')
        author_name = author_name_tag.get_text(strip=True) if author_name_tag else "Unknown Author"

        # Extract the author bio
        author_bio_tag = bio_soup.find('div', class_='biography')
        author_bio = author_bio_tag.get_text(strip=True) if author_bio_tag else "No bio available"

        # Extract the contributor-since date
        contributor_since_tag = bio_soup.find('p', class_='contributor_since')
        contributor_since = contributor_since_tag.get_text(strip=True).replace("Contributor since: ", "") if contributor_since_tag else "Unknown Date"

        # Extract the author's latest articles with heading, excerpt, keywords, and timestamp
        for page in range(1, headline_pages + 1):
            driver.get(f"{author_url}/Page-{page}.html")
            WebDriverWait(driver, 10).until(
                EC.presence_of_element_located((By.CLASS_NAME, "articles"))
            )
            page_soup = BeautifulSoup(driver.page_source, "html.parser")
            article_tags = page_soup.find_all('li', class_='clear')

            for article in article_tags:
                heading_tag = article.find('h3')
                excerpt_tag = article.find('p', class_='articlecontent')
                timestamp_tag = article.find('div', class_='meta')

                if heading_tag and excerpt_tag and timestamp_tag:
                    heading = heading_tag.get_text(strip=True)
                    excerpt = excerpt_tag.get_text(strip=True)
                    timestamp = timestamp_tag.get_text(strip=True).split("|")[0].replace("Published ", "").strip()
                    keywords = extract_keywords(excerpt)

                    other_articles.append({
                        "heading": heading,
                        "excerpt": excerpt,
                        "keywords": keywords,
                        "published_date": timestamp
                    })

    except Exception as e:
        print(f"Error scraping author info: {e}")
        author_name = "Error Occurred"
        author_bio = str(e)
        contributor_since = "N/A"
        other_articles = [{"heading": "Error retrieving articles", "excerpt": "", "keywords": [], "published_date": ""}]

    finally:
        driver.quit()

    return {
        "name": author_name,
        "bio": author_bio,
        "contributor_since": contributor_since,
        "other_articles": other_articles
    }

def save_to_json(data, output_file):
    """Save author info to a JSON file."""
    with open(output_file, mode="w", encoding="utf-8") as file:
        json.dump(data, file, ensure_ascii=False, indent=4)

    print(f"Author info saved to {output_file}")

if __name__ == "__main__":
    # Scrape author info
    author_info = scrape_author_info(AUTHOR_URL, headline_pages=1)

    # Save to JSON
    save_to_json(author_info, OUTPUT_FILE)
26
src/WebScraper/scrapers/tests/selenium_webdriver_test.py
Normal file
@@ -0,0 +1,26 @@
from selenium import webdriver
from selenium.webdriver.firefox.service import Service
from selenium.webdriver.common.by import By
import time

# Provide the path to your geckodriver executable using the Service class
service = Service(executable_path='/usr/local/bin/geckodriver')
driver = webdriver.Firefox(service=service)

# Open a website (e.g., OilPrice.com)
driver.get("https://oilprice.com/Latest-Energy-News/World-News/")

# Wait for the page to load
time.sleep(5)

# Print the page title to verify that it loaded
print(driver.title)

# Find and print some elements on the page, e.g., all article titles
articles = driver.find_elements(By.CSS_SELECTOR, "div.categoryArticle")
for article in articles:
    title = article.find_element(By.TAG_NAME, "a").text
    print(f"Article title: {title}")

# Close the browser
driver.quit()
0
src/WebScraper/setup.py
Normal file
184
src/griffin-stuff/API/API_1.ipynb
Normal file
@@ -0,0 +1,184 @@
{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": 13,
   "id": "69d88f26-f288-4a23-8be5-3e8317e23731",
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "ERROR -1 2104 Market data farm connection is OK:usfarm.nj\n",
      "ERROR -1 2104 Market data farm connection is OK:usfuture\n",
      "ERROR -1 2104 Market data farm connection is OK:cashfarm\n",
      "ERROR -1 2104 Market data farm connection is OK:usfarm\n",
      "ERROR -1 2106 HMDS data farm connection is OK:ushmds\n",
      "ERROR -1 2158 Sec-def data farm connection is OK:secdefnj\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Historical Data Ended\n",
      " Date Open High Low Close Volume\n",
      "0 20241030 18:00:00 69.10 69.10 68.96 69.02 378\n",
      "1 20241030 18:05:00 69.02 69.07 69.01 69.05 99\n",
      "2 20241030 18:10:00 69.06 69.07 69.01 69.01 103\n",
      "3 20241030 18:15:00 69.01 69.02 69.00 69.00 54\n",
      "4 20241030 18:20:00 69.01 69.01 68.99 69.00 25\n",
      "5 20241030 18:25:00 69.00 69.05 69.00 69.04 40\n",
      "6 20241030 18:30:00 69.05 69.05 69.03 69.03 63\n",
      "7 20241030 18:35:00 69.03 69.03 69.00 69.00 64\n",
      "8 20241030 18:40:00 68.99 69.01 68.98 68.99 60\n",
      "9 20241030 18:45:00 68.99 68.99 68.95 68.97 66\n",
      "10 20241030 18:50:00 68.97 69.00 68.96 68.99 44\n",
      "11 20241030 18:55:00 68.98 68.98 68.97 68.98 23\n",
      "12 20241030 19:00:00 68.98 69.02 68.98 69.01 48\n",
      "13 20241030 19:05:00 69.02 69.03 69.00 69.01 31\n",
      "14 20241030 19:10:00 69.02 69.02 69.00 69.00 22\n",
      "15 20241030 19:15:00 69.00 69.00 68.99 68.99 11\n",
      "16 20241030 19:20:00 68.99 68.99 68.95 68.95 40\n",
      "17 20241030 19:25:00 68.95 68.95 68.94 68.94 55\n",
      "18 20241030 19:30:00 68.94 68.96 68.93 68.95 54\n",
      "19 20241030 19:35:00 68.95 68.97 68.95 68.96 29\n",
      "20 20241030 19:40:00 68.96 68.98 68.96 68.98 47\n",
      "21 20241030 19:45:00 68.98 68.99 68.95 68.95 65\n",
      "22 20241030 19:50:00 68.96 68.98 68.96 68.97 16\n",
      "23 20241030 19:55:00 68.97 68.97 68.94 68.94 35\n",
      "24 20241030 20:00:00 68.95 68.99 68.91 68.92 369\n",
      "25 20241030 20:05:00 68.91 68.94 68.91 68.93 74\n",
      "26 20241030 20:10:00 68.93 68.95 68.89 68.94 187\n",
      "27 20241030 20:15:00 68.94 68.95 68.92 68.94 81\n",
      "28 20241030 20:20:00 68.95 68.97 68.94 68.96 89\n",
      "29 20241030 20:25:00 68.96 68.96 68.92 68.94 96\n",
      "30 20241030 20:30:00 68.94 68.98 68.93 68.96 94\n",
      "31 20241030 20:35:00 68.97 68.97 68.93 68.94 66\n",
      "32 20241030 20:40:00 68.95 68.95 68.93 68.94 44\n",
      "33 20241030 20:45:00 68.93 68.96 68.93 68.94 98\n",
      "34 20241030 20:50:00 68.94 68.94 68.92 68.92 95\n"
     ]
    }
   ],
   "source": [
    "from ibapi.client import EClient\n",
    "from ibapi.wrapper import EWrapper\n",
    "from ibapi.contract import Contract\n",
    "import threading\n",
    "import time\n",
    "import pandas as pd\n",
    "\n",
    "# Define the IB API app\n",
    "class IBApi(EWrapper, EClient):\n",
    "    def __init__(self):\n",
    "        EClient.__init__(self, self)\n",
    "        self.data = []  # Initialize an empty list to store data\n",
    "\n",
    "    # Override the historicalData function to process and store incoming data\n",
    "    def historicalData(self, reqId, bar):\n",
    "        # Append the data as a dictionary to self.data\n",
    "        self.data.append({\n",
    "            \"Date\": bar.date,\n",
    "            \"Open\": bar.open,\n",
    "            \"High\": bar.high,\n",
    "            \"Low\": bar.low,\n",
    "            \"Close\": bar.close,\n",
    "            \"Volume\": bar.volume\n",
    "        })\n",
    "\n",
    "    def historicalDataEnd(self, reqId, start, end):\n",
    "        print(\"Historical Data Ended\")\n",
    "        # Convert the data to a DataFrame when data collection is complete\n",
    "        self.df = pd.DataFrame(self.data)\n",
    "        print(self.df)  # Display the DataFrame to verify\n",
    "        self.disconnect()  # Disconnect after data collection is complete\n",
    "\n",
    "# Define the app handler for running in the notebook\n",
    "class IBApp:\n",
    "    def __init__(self):\n",
    "        self.app = IBApi()\n",
    "\n",
    "    def connect(self):\n",
    "        self.app.connect(\"127.0.0.1\", 7496, 0)  # Change port if needed\n",
    "        thread = threading.Thread(target=self.run_app, daemon=True)\n",
    "        thread.start()\n",
    "        time.sleep(1)  # Allow time for the connection to establish\n",
    "\n",
    "    def run_app(self):\n",
    "        self.app.run()\n",
    "\n",
    "    def request_oil_data(self):\n",
    "        # Define the contract for Crude Oil Futures\n",
    "        contract = Contract()\n",
    "        contract.symbol = \"CL\"\n",
    "        contract.secType = \"FUT\"\n",
    "        contract.exchange = \"NYMEX\"\n",
    "        contract.currency = \"USD\"\n",
    "        contract.lastTradeDateOrContractMonth = \"202412\"  # Example: Dec 2024 contract\n",
    "\n",
    "        # Request historical data\n",
    "        self.app.reqHistoricalData(\n",
    "            reqId=1,\n",
    "            contract=contract,\n",
    "            endDateTime='',\n",
    "            durationStr='1 D',  # 1 day of data\n",
    "            barSizeSetting='5 mins',\n",
    "            whatToShow='TRADES',\n",
    "            useRTH=0,\n",
    "            formatDate=1,\n",
    "            keepUpToDate=False,\n",
    "            chartOptions=[]\n",
    "        )\n",
    "\n",
    "    def disconnect(self):\n",
    "        self.app.disconnect()\n",
    "\n",
    "# Create an instance and connect\n",
    "app = IBApp()\n",
    "app.connect()\n",
    "\n",
    "# Request data and output to a DataFrame\n",
    "app.request_oil_data()\n",
    "\n",
    "# Wait for data retrieval to complete\n",
    "time.sleep(10)\n",
    "\n",
    "# Access the DataFrame\n",
    "df = app.app.df if hasattr(app.app, 'df') else pd.DataFrame()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 17,
   "id": "2088c621-81d3-46f0-8596-ce05d1a89fd4",
   "metadata": {},
   "outputs": [],
   "source": [
    "data = df.to_csv()"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.12.3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
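The notebook above waits a fixed `time.sleep(10)` for the historical data to arrive, which can either waste time or cut the request short. A minimal sketch of a more robust pattern: signal completion with a `threading.Event` set from the `historicalDataEnd` callback. The names here (`done`, `historical_data_end`, `worker`) are illustrative stand-ins, not part of the notebook; the callback body is simulated rather than wired to a live IB connection.

```python
import threading

# Event the main thread waits on; the API callback sets it when data is complete.
done = threading.Event()

def historical_data_end(reqId, start, end):
    # ...build the DataFrame here, as historicalDataEnd does in the notebook...
    done.set()  # Wake the waiting main thread

# Simulate the callback firing from the API's reader thread.
worker = threading.Thread(target=historical_data_end, args=(1, "", ""))
worker.start()

# Wait up to 30 s instead of sleeping a fixed 10 s; returns True once set.
finished = done.wait(timeout=30)
print(finished)
```

This returns as soon as the callback fires, and the timeout still bounds the wait if the request silently fails.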
2074
src/griffin-stuff/API/API_2 (1).ipynb
Normal file
File diff suppressed because one or more lines are too long
BIN
src/griffin-stuff/API/Trading_Bot_Development_Strategy (1).docx
Normal file
Binary file not shown.
246
src/griffin-stuff/GUSHTradingBotV1.0.py
Normal file
@@ -0,0 +1,246 @@
import numpy as np
import pandas as pd
import yfinance as yf
from scipy.optimize import minimize


def ticker_info():
    ticker = "gush"
    return ticker.upper()


def fetch_expiration_dates(ticker):
    print(f"Fetching available expiration dates for {ticker}...")
    stock = yf.Ticker(ticker)
    expiration_dates = stock.options
    print(f"Available expiration dates: {expiration_dates}")
    return expiration_dates


def select_expiration_date(expiration_dates):
    print("Selecting the first available expiration date...")
    expiration_date = expiration_dates[0]
    print(f"Selected expiration date: {expiration_date}")
    return expiration_date


def fetch_option_chain(ticker, expiration_date):
    print(f"Fetching option chain for {ticker} with expiration date {expiration_date}...")
    stock = yf.Ticker(ticker)
    options_chain = stock.option_chain(expiration_date)
    print("Option chain fetched successfully!")
    return options_chain


def get_price_data(ticker, start_date, end_date):
    print(f"Fetching price data for {ticker} from {start_date} to {end_date}...")
    data = yf.download(ticker, start=start_date, end=end_date)
    print(f"Price data fetched successfully for {ticker}!")
    return data


def moving_average_strategy(data, short_window=20, long_window=50):
    data['Short_MA'] = data['Close'].rolling(window=short_window).mean()
    data['Long_MA'] = data['Close'].rolling(window=long_window).mean()
    data['Signal'] = np.where(data['Short_MA'] > data['Long_MA'], 1, -1)
    return data['Signal']


def rsi_strategy(data, window=14, overbought=70, oversold=30):
    delta = data['Close'].diff(1)
    gain = np.where(delta > 0, delta, 0).flatten()  # Flatten to a 1D array
    loss = np.where(delta < 0, abs(delta), 0).flatten()  # Flatten to a 1D array

    avg_gain = pd.Series(gain).rolling(window=window).mean()
    avg_loss = pd.Series(loss).rolling(window=window).mean()

    # Avoid division by zero by replacing 0 with np.nan in avg_loss
    rs = avg_gain / np.where(avg_loss == 0, np.nan, avg_loss)

    rsi = 100 - (100 / (1 + rs))

    signal = np.where(rsi < oversold, 1, np.where(rsi > overbought, -1, 0))
    return pd.Series(signal, index=data.index)


def bollinger_bands_strategy(data, window=20, num_std=2):
    # Calculate moving average
    data['Moving_Avg'] = data['Close'].rolling(window=window).mean()

    # Calculate rolling standard deviation and force it to be a Series
    rolling_std = data['Close'].rolling(window).std()
    rolling_std = rolling_std.squeeze()  # Ensure rolling_std is a Series

    # Print shapes for debugging
    print(f"Shape of Moving_Avg: {data['Moving_Avg'].shape}")
    print(f"Shape of Rolling Std: {rolling_std.shape}")

    # Calculate upper and lower bands
    data['Band_Upper'] = data['Moving_Avg'] + (num_std * rolling_std)
    data['Band_Lower'] = data['Moving_Avg'] - (num_std * rolling_std)

    # Print shapes after assignments for debugging
    print(f"Shape of Band_Upper: {data['Band_Upper'].shape}")
    print(f"Shape of Band_Lower: {data['Band_Lower'].shape}")

    # Check for NaN values
    print(f"NaNs in Close: {data['Close'].isna().sum()}")
    print(f"NaNs in Band_Upper: {data['Band_Upper'].isna().sum()}")
    print(f"NaNs in Band_Lower: {data['Band_Lower'].isna().sum()}")

    # Print the columns of the DataFrame
    print(f"Columns in data before dropping NaNs: {data.columns.tolist()}")

    # Optionally drop rows with NaNs
    data = data.dropna(subset=['Close', 'Band_Upper', 'Band_Lower'])

    # Generate signals based on the bands
    signal = np.where(data['Close'] < data['Band_Lower'], 1,
                      np.where(data['Close'] > data['Band_Upper'], -1, 0))

    return pd.Series(signal, index=data.index)


def generate_signals(data):
    ma_signal = moving_average_strategy(data)
    rsi_signal = rsi_strategy(data)
    bollinger_signal = bollinger_bands_strategy(data)
    return pd.DataFrame({'MA': ma_signal, 'RSI': rsi_signal, 'Bollinger': bollinger_signal})


def backtest_option_trades(option_chain, signals, stock_data):
    """
    Backtest option trades based on the given signals and stock data.

    `signals` is a pandas Series of combined (weighted) signal values aligned
    with `stock_data`: positive values are BUY signals, negative values SELL.
    """
    trades = []
    current_position = None

    # Ensure both stock_data and option_chain indices are sorted in ascending order
    stock_data = stock_data.sort_index()

    # Convert 'lastTradeDate' or any date-related columns to datetime in option_chain
    if 'lastTradeDate' in option_chain.columns:
        option_chain['lastTradeDate'] = pd.to_datetime(option_chain['lastTradeDate'])
        option_chain = option_chain.set_index('lastTradeDate')

    # If option_chain index isn't datetime, convert it to datetime (ensuring compatibility)
    option_chain.index = pd.to_datetime(option_chain.index)

    # Remove the timezone from option_chain index
    option_chain.index = option_chain.index.tz_localize(None)

    # Now reindex the option chain to match the stock data index (forward-fill missing option prices)
    option_chain = option_chain.sort_index()
    option_chain = option_chain.reindex(stock_data.index, method='ffill')

    for i in range(len(signals)):
        # Fix: the weighted signal is a single float per bar, not a row with an
        # 'MA' column, so threshold on its sign instead of comparing to 1/-1.
        sig = signals.iloc[i]
        if sig > 0 and current_position is None:
            # BUY signal
            entry_price = option_chain['lastPrice'].iloc[i]
            if pd.isna(entry_price):  # If price is NaN, log the error and continue
                print(f"Missing entry price on {stock_data.index[i]}, skipping trade.")
                continue
            entry_date = stock_data.index[i]
            current_position = {
                'entry_price': entry_price,
                'entry_date': entry_date
            }
            print(f"BUY signal on {entry_date}: Entry Price = {entry_price}")

        elif sig < 0 and current_position is not None:
            # SELL signal
            exit_price = option_chain['lastPrice'].iloc[i]
            if pd.isna(exit_price):  # If price is NaN, log the error and continue
                print(f"Missing exit price on {stock_data.index[i]}, skipping trade.")
                continue
            exit_date = stock_data.index[i]
            pnl = (exit_price - current_position['entry_price']) * 100
            print(f"SELL signal on {exit_date}: Exit Price = {exit_price}, P&L = {pnl}")

            trades.append({
                'entry_date': current_position['entry_date'],
                'entry_price': current_position['entry_price'],
                'exit_date': exit_date,
                'exit_price': exit_price,
                'pnl': pnl
            })
            current_position = None

    cumulative_pnl = sum(trade['pnl'] for trade in trades)
    total_wins = sum(1 for trade in trades if trade['pnl'] > 0)
    total_trades = len(trades)
    win_rate = total_wins / total_trades if total_trades > 0 else 0

    return cumulative_pnl, trades, win_rate


def objective_function_profit(weights, strategy_signals, data, option_chain):
    weights = np.array(weights)
    weights /= np.sum(weights)  # Normalize weights
    weighted_signals = weighted_signal_combination(strategy_signals, weights)

    # `backtest_option_trades` returns 3 values; only the cumulative P&L is needed here
    cumulative_pnl, _, _ = backtest_option_trades(option_chain, weighted_signals, data)

    # Return negative cumulative P&L so that minimizing maximizes profit
    return -cumulative_pnl


def optimize_weights(strategy_signals, data, option_chain):
    initial_weights = [1 / len(strategy_signals.columns)] * len(strategy_signals.columns)
    constraints = ({'type': 'eq', 'fun': lambda weights: np.sum(weights) - 1})
    bounds = [(0, 1)] * len(strategy_signals.columns)

    result = minimize(objective_function_profit, initial_weights, args=(strategy_signals, data, option_chain),
                      method='SLSQP', bounds=bounds, constraints=constraints)
    return result.x  # Optimal weights


def weighted_signal_combination(strategy_signals, weights):
    weighted = np.sum([signal * weight for signal, weight in zip(strategy_signals.T.values, weights)], axis=0)
    # Return a Series aligned with the signal index so it can be backtested directly
    return pd.Series(weighted, index=strategy_signals.index)


def main_decision(weighted_signals):
    last_signal = weighted_signals.iloc[-1]  # Latest signal
    if last_signal > 0:
        return "BUY"
    elif last_signal < 0:
        return "SELL"
    else:
        return "HOLD"


def run_backtest():
    ticker = ticker_info()
    expiration_dates = fetch_expiration_dates(ticker)
    expiration_date = select_expiration_date(expiration_dates)
    options_chain = fetch_option_chain(ticker, expiration_date)

    # Fetch training data
    train_data = get_price_data(ticker, '2010-01-01', '2022-01-01')

    # Generate signals
    strategy_signals_train = generate_signals(train_data)

    # Optimize weights
    optimal_weights = optimize_weights(strategy_signals_train, train_data, options_chain.calls)

    # Fetch test data
    test_data = get_price_data(ticker, '2022-01-02', '2024-01-01')

    # Generate test signals
    strategy_signals_test = generate_signals(test_data)

    # Combine signals and backtest
    weighted_signals = weighted_signal_combination(strategy_signals_test, optimal_weights)
    cumulative_pnl, trades, win_rate = backtest_option_trades(options_chain.calls, weighted_signals, test_data)

    # Make final decision
    decision = main_decision(weighted_signals)
    print(f"Final decision: {decision}")

    # Output results
    print(f"Cumulative P&L: {cumulative_pnl}")
    print(f"Win Rate: {win_rate * 100:.2f}%")


# Call the main function
run_backtest()
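The RSI computation in `rsi_strategy` above can be sanity-checked in isolation. This standalone sketch reproduces the same formula on synthetic prices (the seeded random walk is illustrative data, not anything from the repository) and confirms the indicator stays within its 0-100 bounds:

```python
import numpy as np
import pandas as pd

# Synthetic price series: a seeded random walk around 100.
prices = pd.DataFrame({"Close": 100 + np.cumsum(np.random.default_rng(0).normal(0, 1, 60))})

# Same steps as rsi_strategy: split moves into gains/losses, smooth, form RS, then RSI.
delta = prices["Close"].diff(1)
gain = np.where(delta > 0, delta, 0)
loss = np.where(delta < 0, -delta, 0)
avg_gain = pd.Series(gain).rolling(window=14).mean()
avg_loss = pd.Series(loss).rolling(window=14).mean()
rs = avg_gain / np.where(avg_loss == 0, np.nan, avg_loss)  # NaN where avg_loss is 0
rsi = 100 - (100 / (1 + rs))

# Since avg_gain and avg_loss are non-negative, rs >= 0 and RSI lies in [0, 100].
print(rsi.dropna().between(0, 100).all())  # → True
```

The first `window - 1` values are NaN by construction (the rolling mean has no full window yet), which is why the check drops NaNs before asserting the bounds.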
17536
src/griffin-stuff/IBKR/3_month_testing_data.csv
Normal file
File diff suppressed because it is too large
Load Diff
137685
src/griffin-stuff/IBKR/3_years_training_data.csv
Normal file
File diff suppressed because it is too large
Load Diff
78
src/griffin-stuff/IBKR/predict_price.py
Normal file
@@ -0,0 +1,78 @@
import pandas as pd
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_squared_error, mean_absolute_error
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.callbacks import EarlyStopping

# Load the training and testing data
training_data = pd.read_csv("3_years_training_data.csv")
testing_data = pd.read_csv("3_month_testing_data.csv")

# Drop unnecessary columns
training_data = training_data.drop(columns=["Unnamed: 0", "Date"])
testing_data = testing_data.drop(columns=["Unnamed: 0", "Date"])

# Create lagged features for the model
def create_lagged_features(data, n_lags=3):
    df = data.copy()
    for lag in range(1, n_lags + 1):
        df[f'Close_lag_{lag}'] = df['Close'].shift(lag)
    df.dropna(inplace=True)  # Remove rows with NaN values due to shifting
    return df

# Apply lagged features to the training and testing datasets
training_data = create_lagged_features(training_data)
testing_data = create_lagged_features(testing_data)

# Separate features and target
X_train = training_data.drop(columns=["Close"]).values
y_train = training_data["Close"].values
X_test = testing_data.drop(columns=["Close"]).values
y_test = testing_data["Close"].values

# Standardize the features
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# Build the neural network model
model = Sequential([
    Dense(64, activation='sigmoid', input_shape=(X_train.shape[1],)),
    Dense(32, activation='sigmoid'),
    Dense(16, activation='sigmoid'),
    Dense(1)  # Output layer for regression
])

# Compile the model
model.compile(optimizer='adam', loss='mse', metrics=['mae'])

# Use early stopping to prevent overfitting
early_stopping = EarlyStopping(monitor='val_loss', patience=10, restore_best_weights=True)

# Train the model
history = model.fit(
    X_train, y_train,
    epochs=100,
    batch_size=32,
    validation_split=0.2,
    callbacks=[early_stopping],
    verbose=1
)

# Evaluate the model on the test set
y_pred = model.predict(X_test).flatten()
mse = mean_squared_error(y_test, y_pred)
mae = mean_absolute_error(y_test, y_pred)

print(f"Neural Network MSE: {mse:.2f}")
print(f"Neural Network MAE: {mae:.2f}")

# Prepare the latest data to predict tomorrow's price
latest_data = testing_data.tail(1).drop(columns=["Close"])
latest_data_scaled = scaler.transform(latest_data)

# Predict tomorrow's close price
tomorrow_pred = model.predict(latest_data_scaled)
print(f"Predicted Close Price for Tomorrow: {tomorrow_pred[0][0]:.2f}")
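The lagged-feature step in `predict_price.py` is easiest to see on a tiny frame. This sketch repeats the same `create_lagged_features` logic on five synthetic closes (illustrative data only): each lag adds one column, and the first `n_lags` rows are dropped because their shifted values are NaN.

```python
import pandas as pd

def create_lagged_features(data, n_lags=3):
    # Same construction as in predict_price.py: shift Close back by 1..n_lags bars.
    df = data.copy()
    for lag in range(1, n_lags + 1):
        df[f'Close_lag_{lag}'] = df['Close'].shift(lag)
    df.dropna(inplace=True)  # Rows without a full lag history are removed
    return df

raw = pd.DataFrame({"Close": [1.0, 2.0, 3.0, 4.0, 5.0]})
lagged = create_lagged_features(raw)

# 5 rows minus 3 dropped, 1 target column plus 3 lag columns.
print(lagged.shape)  # → (2, 4)
```

So with daily bars, each training row pairs today's close with the three previous closes as features.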
47
src/griffin-stuff/IBKR/requirements.txt
Normal file
@@ -0,0 +1,47 @@
absl-py==2.1.0
astunparse==1.6.3
certifi==2024.8.30
charset-normalizer==3.4.0
flatbuffers==24.3.25
gast==0.6.0
google-pasta==0.2.0
grpcio==1.67.1
h5py==3.12.1
ibapi==9.81.1.post1
idna==3.10
importlib_metadata==8.5.0
joblib==1.4.2
keras==3.6.0
libclang==18.1.1
Markdown==3.7
markdown-it-py==3.0.0
MarkupSafe==3.0.2
mdurl==0.1.2
ml-dtypes==0.4.1
namex==0.0.8
numpy==2.0.2
opt_einsum==3.4.0
optree==0.13.0
packaging==24.1
pandas==2.2.3
protobuf==5.28.3
Pygments==2.18.0
python-dateutil==2.9.0.post0
pytz==2024.2
requests==2.32.3
rich==13.9.4
scikit-learn==1.5.2
scipy==1.13.1
six==1.16.0
tensorboard==2.18.0
tensorboard-data-server==0.7.2
tensorflow==2.18.0
tensorflow-io-gcs-filesystem==0.37.1
termcolor==2.5.0
threadpoolctl==3.5.0
typing_extensions==4.12.2
tzdata==2024.2
urllib3==2.2.3
Werkzeug==3.1.1
wrapt==1.16.0
zipp==3.20.2