
server – How to host a Python project in the cloud (specifically Google Cloud Platform)


I have a pretty simple (I think) question about hosting a personal project on a cloud server. Currently, I run my script manually every morning. I would like to automate this by scheduling the script to run on a cloud server so I can “set it and forget it”.
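
For reference, by “scheduling” I just mean: run the script once per day, either via a cron entry on an always-on machine or with a long-running loop like the sketch below, which uses the third-party schedule package (the job body is only a placeholder for my real script):

import time

import schedule  # third-party package: pip install schedule


def daily_job():
    # Placeholder for the real work: scrape, make picks, tweet, save to SQLite.
    print("running daily picks...")


# Run the job once per day at 08:00 (local time of the machine it runs on).
schedule.every().day.at("08:00").do(daily_job)

while True:
    schedule.run_pending()
    time.sleep(60)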

Project details: I have a project that scrapes NBA data from various sites and makes a pick for each matchup that day. It tweets these picks and saves them to a SQLite database so it can check them on the next run. Below is the structure of the project; a rough sketch of the daily flow follows the tree.

twitterBot
├── __pycache__
│   ├── config.cpython-311.pyc
│   └── twitter.cpython-311.pyc
├── config.py
├── main.py
├── nba_models
│   ├── __init__.py
│   ├── __pycache__
│   │   ├── __init__.cpython-311.pyc
│   │   ├── config.cpython-311.pyc
│   │   └── get_picks.cpython-311.pyc
│   ├── get_picks.py
│   ├── nba_pick_data_2023.db
│   ├── nba_pick_recent_data_2023.db
│   ├── nba_scraper
│   │   ├── __init__.py
│   │   ├── __pycache__
│   │   │   ├── __init__.cpython-311.pyc
│   │   │   └── scraper.cpython-311.pyc
│   │   └── scraper.py
│   └── vegas
│       ├── __init__.py
│       ├── __pycache__
│       │   ├── __init__.cpython-311.pyc
│       │   └── vegas.cpython-311.pyc
│       └── vegas.py
├── test.ipynb
└── twitter.py
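
For context, main.py boils down to roughly the daily flow below. The helper names are simplified placeholders I made up for this post, not my real functions:

# Placeholder stubs standing in for the real code in nba_models/ and twitter.py.
def grade_previous_picks():
    print("checking yesterday's picks against results in the SQLite database...")


def scrape_todays_games():
    return [{"home": "BOS", "away": "LAL"}]  # made-up example matchup


def make_picks(games):
    return [{"game": game, "pick": game["home"]} for game in games]


def tweet_picks(picks):
    print("tweeting:", picks)


def save_picks(picks):
    print("saving picks so tomorrow's run can grade them:", picks)


def main():
    grade_previous_picks()
    games = scrape_todays_games()
    picks = make_picks(games)
    tweet_picks(picks)
    save_picks(picks)


if __name__ == "__main__":
    # Whatever scheduler I end up with only needs to run `python main.py` once a day.
    main()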

Since the project is kind of complicated, below is a simplified version of the project that contains most of the fundamentals. I would use this directory to test/get the hang of whatever services I end up using.

cloudTest is the root directory and contains the files below.

cloudTest
├── __pycache__
│   └── config.cpython-311.pyc
├── config.py
├── example.db
├── pic.png
└── test.py

test.py is below and contains most of the functionality my main project has.

import sqlite3
import os
import tweepy
import config

# Create the SQLite database/table if needed and insert a row
current_directory = os.getcwd()
db_file_path = os.path.join(current_directory, 'example.db')
conn = sqlite3.connect(db_file_path)
cur = conn.cursor()

table_creation_query = '''
CREATE TABLE IF NOT EXISTS example_table (
    id INTEGER PRIMARY KEY,
    name TEXT,
    age INTEGER
);
'''
cur.execute(table_creation_query)

# Get the maximum existing ID
max_id_query = 'SELECT MAX(id) FROM example_table'
cur.execute(max_id_query)
max_id = cur.fetchone()[0]

# Generate new data programmatically
new_name = "New User"
new_age = 25

# Increment the ID for the new row
new_id = max_id + 1 if max_id is not None else 1

# Insert new data into the table
insert_query = 'INSERT INTO example_table (id, name, age) VALUES (?, ?, ?)'
cur.execute(insert_query, (new_id, new_name, new_age))

# Read back the row that was just inserted (parameterized, no string formatting in SQL)
cur.execute("SELECT * FROM example_table WHERE id = ?", (new_id,))
new_data = cur.fetchone()

# Commit the changes and close the connection
conn.commit()
conn.close()


# Authenticate twice: the v1.1 API object handles the media upload, while the
# v2 Client posts the tweet itself.
auth = tweepy.OAuth1UserHandler(config.CONSUMER_KEY, config.CONSUMER_SECRET)
auth.set_access_token(config.ACCESS_TOKEN, config.ACCESS_TOKEN_SECRET)
api = tweepy.API(auth)

client = tweepy.Client(
            consumer_key=config.CONSUMER_KEY,
            consumer_secret=config.CONSUMER_SECRET,
            access_token=config.ACCESS_TOKEN,
            access_token_secret=config.ACCESS_TOKEN_SECRET,
        )


media = api.simple_upload(f"{current_directory}/pic.png")
client.create_tweet(text=f"Just inserted {new_data}", media_ids=[media.media_id])
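
config.py is not shown above; it only holds the four Twitter credentials used by test.py. Locally it could be as simple as the sketch below, which reads them from environment variables so nothing sensitive is hard-coded (the environment variable names are just ones I picked):

import os

# Read the Twitter/X credentials from the environment rather than hard-coding
# them, so the same file works locally and on whatever cloud host I choose.
CONSUMER_KEY = os.environ["TWITTER_CONSUMER_KEY"]
CONSUMER_SECRET = os.environ["TWITTER_CONSUMER_SECRET"]
ACCESS_TOKEN = os.environ["TWITTER_ACCESS_TOKEN"]
ACCESS_TOKEN_SECRET = os.environ["TWITTER_ACCESS_TOKEN_SECRET"]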

I have tried looking online for answers, but I am not very familiar with cloud providers/external servers, so I do not know where to start. At first I thought this would be a relatively simple problem to solve, but after some research it feels more complicated than I expected. Can someone point me in the right direction? Are there any good resources on this topic?

Below are some details that might be useful:

  • I have just created a GCP account and have $300 to work with.

  • I have seen a lot of suggestions for Google Cloud Functions for this type of work, but I think my project might be too large for that: I define two Python packages nested inside another package, I am not sure how the SQLite databases would work in Cloud Functions, and I install/use multiple packages that are not in the standard Python distribution. But if anyone thinks this is the right idea, please let me know (see the rough Cloud Functions sketch after this list).

  • I have seen suggestions for something like DigitalOcean. Is this the right route?
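
To make the Cloud Functions concern concrete, below is roughly how I imagine test.py would have to change if I went that route. The entry point name is made up, it relies on the functions-framework package, and as far as I understand only /tmp is writable in Cloud Functions and does not persist between instances, so the SQLite file would have to be copied to and from durable storage (e.g. a Cloud Storage bucket) for the pick history to survive between runs:

import os
import sqlite3

import functions_framework  # pip install functions-framework


@functions_framework.http
def run_daily_picks(request):
    # Cloud Functions only allows writes under /tmp, and /tmp is not preserved
    # between invocations, so this database starts empty on a cold start.
    db_path = os.path.join("/tmp", "example.db")
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()
    cur.execute(
        "CREATE TABLE IF NOT EXISTS example_table (id INTEGER PRIMARY KEY, name TEXT, age INTEGER)"
    )
    # ... the same insert / tweet logic as in test.py would go here ...
    conn.commit()
    conn.close()
    return "ok", 200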

If more details are needed please let me know. I appreciate any and all help.


