
PuterBot-46: Add Publishing using magic-wormhole #64

Closed
wants to merge 45 commits

Conversation

Mustaballer
Collaborator

@Mustaballer Mustaballer commented May 2, 2023

Overview

Unit Tests Created

  • will do after review & iteration

Steps to QA

  • After creating a recording, run the following commands in your virtual environment (a rough sketch of these entry points follows this list)
  • For sending a recording: python -m puterbot.share send --recording_id=1
  • For receiving a recording:
    • create a recordings folder, e.g. puterbot/puterbot/recordings, and navigate to it
    • run python -m puterbot.share receive --wormhole_code=<wormhole_code>
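
For context, a minimal sketch of how the send/receive entry points could wrap the magic-wormhole CLI via subprocess; the module layout and function names are illustrative assumptions, not the actual puterbot.share implementation.

import subprocess

def send_file(zip_path):
    # `wormhole send` prints a one-time code for the receiver to enter.
    subprocess.run(["wormhole", "send", zip_path], check=True)

def receive_file(wormhole_code):
    # Downloads the file offered under the given code into the current directory.
    subprocess.run(["wormhole", "receive", wormhole_code], check=True)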

@abrichr
Member

abrichr commented May 2, 2023

Let's replace --output_folder=output with config.RECORDING_DIR_PATH
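
For illustration only (the actual definition is not shown in this thread), RECORDING_DIR_PATH could live in puterbot/config.py and serve as the default output location:

# Hypothetical sketch; the real definition may differ.
from pathlib import Path

ROOT_DIR_PATH = Path(__file__).parent
RECORDING_DIR_PATH = ROOT_DIR_PATH / "recordings"

def receive_recording(wormhole_code, output_dir=RECORDING_DIR_PATH):
    # Write received files under the configured recordings directory
    # rather than a hard-coded "output" folder.
    ...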

@Mustaballer Mustaballer self-assigned this May 15, 2023
@Mustaballer Mustaballer marked this pull request as draft May 17, 2023 18:46
@Mustaballer
Collaborator Author

Mustaballer commented May 17, 2023

@abrichr What should the path of config.RECORDING_DIR_PATH be?

Right now I've broken the tasks down into the following:

  • Store Screenshot differences in the db
  • Figure out how to convert to png_data on save (see the sketch after this list)
  • Zip the Screenshot differences
  • Use magic-wormhole to send the zip archive of screenshot diffs
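
A rough sketch of the diff/png_data step, assuming PIL is used for the screenshots; the function name is illustrative, not existing puterbot code.

import io
from PIL import ImageChops

def diff_png_data(prev, curr):
    # Pixel-wise difference between consecutive Screenshot images.
    diff = ImageChops.difference(prev, curr)
    # Serialize to PNG bytes ("png_data") in memory for storage in the db.
    buf = io.BytesIO()
    diff.save(buf, format="PNG")
    return buf.getvalue()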


@Mustaballer
Collaborator Author

@abrichr I have successfully sent a compressed zip file using magic-wormhole!

The image below shows the send flow: zipping the recording screenshots --> sending the zip file through a wormhole --> completing when the zip file is received.
[image: terminal output of the send flow]

Question/Issue:

  • Right now I am sending every screenshot associated with a recording as a compressed zip file. I know the plan is to store the image diffs in the db; is that also what we want to share across computers, as opposed to a db file?
  • I currently use the highest compression for these screenshots, but when I extract the images after receiving the zip file, I am unable to view the PNGs ("file type is unsupported"). Without compressing the zip file I can view the PNGs after extracting, but the archive ends up being at least a GB in size (see the zipfile sketch below).
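
For reference, a minimal zipfile sketch that makes the compression setting explicit; paths and names are illustrative. Since PNG data is already compressed, switching between ZIP_STORED and ZIP_DEFLATED should not change whether the extracted files are valid PNGs.

import zipfile
from pathlib import Path

def zip_screenshots(screenshot_dir, zip_path):
    # compresslevel=9 is the "highest compression" case for ZIP_DEFLATED;
    # zipfile.ZIP_STORED would skip compression entirely.
    with zipfile.ZipFile(zip_path, "w", compression=zipfile.ZIP_DEFLATED, compresslevel=9) as zf:
        for png in sorted(Path(screenshot_dir).glob("*.png")):
            zf.write(png, arcname=png.name)
    return zip_path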

@Mustaballer
Collaborator Author

@abrichr Currently, I am using magic-wormhole to transfer the entire database file. I attempted to create a new database file and add the recording to it, but encountered errors in the process. It appears that there is a workaround available by marking the recording as transient and adding it to the new database. However, this approach removes the recording from the old database, which is not the desired outcome. As a temporary solution, we can continue sending the entire database file. What are your thoughts on this?

@abrichr
Member

abrichr commented May 25, 2023

@Mustaballer can you please clarify what it means to mark a recording as transient?

I think what we want is this:

  1. Export to sql:
def export_sql(recording_id):
    # Export data for recordings e.g. via SELECT or similar
    return sql

  2. Create new database:
def create_db(recording_id):
    # Assumes: import os, time; from datetime import datetime
    fname_parts = [
        config.DB_FNAME,
        str(recording_id),
        datetime.now().strftime(config.DT_FMT),
    ]
    db_fname = "-".join(fname_parts)

    # append to .env before running alembic
    # backup first
    t = time.time()
    os.system(f"cp {config.ENV_FILE_PATH} {config.ENV_FILE_PATH}-{t}")
    with open(config.ENV_FILE_PATH, "a") as f:
        f.write(f"\nDB_FNAME={db_fname}")
    os.system("alembic upgrade head")

    # update current running configuration
    config.set_db_fname(db_fname)
    db.engine = db.get_engine()
    return t

puterbot/config.py:


ENV_FILE_PATH = (ROOT_DIR_PATH / ".." / ".env").resolve()
logger.info(f"{ENV_FILE_PATH=}")
load_dotenv(ENV_FILE_PATH)

def set_db_fname(db_fname):
    global DB_FNAME
    DB_FNAME = db_fname
    set_db_fpath()

def set_db_fpath():
    global DB_FPATH
    DB_FPATH = ROOT_DIR_PATH / DB_FNAME
    set_db_url()  # keep DB_URL in sync with the new path

def set_db_url():
    global DB_URL
    DB_URL = f"sqlite:///{DB_FPATH}"
    logger.info(f"{DB_URL=}")

set_db_url()

alembic/env.py:

from puterbot import config
...
def get_url():
    return config.DB_URL
  3. Import data from exported SQL:
def run_sql(sql):
    with db.engine.connect() as con:
        result = con.execute(sql)
        for row in result:
            logger.info(f"{row=}")

Altogether:

def export_recording(recording_id):
    sql = export_sql(recording_id)
    t = create_db(recording_id)
    run_sql(sql)
    restore_db(t)
    # TODO: undo configuration changes made in create_db

Untested and needs refactoring :)

@abrichr
Member

abrichr commented May 31, 2023

Hi @Mustaballer what is the status of this?

@Mustaballer
Collaborator Author

The current status is still in progress. The next steps are:

  1. Merge with latest changes from main
  2. Resolve issues with env file
  3. Create unit tests

@Mustaballer
Collaborator Author

@abrichr I resolved the issue with creating a new db file; the issue I currently face is importing the recording into the new db file. This code executes a SQL query on the db, but it doesn't actually import the data:

def run_sql(sql):
    with db.engine.connect() as con:
        result = con.execute(sql)
        for row in result:
            logger.info(f"{row=}")
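
One possible explanation (an assumption on my part, not verified against this branch): with SQLAlchemy, statements executed on a plain connection run inside a transaction that is discarded unless explicitly committed, so any INSERTs from the exported SQL would be rolled back when the connection closes. A minimal sketch that commits:

from sqlalchemy import text

def run_sql(sql):
    # engine.begin() opens a transaction and commits it on successful exit,
    # so INSERT/UPDATE statements are actually persisted.
    with db.engine.begin() as con:
        con.execute(text(sql))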

@Mustaballer
Collaborator Author

Will close this PR and open a new one without all the formatting changes.

@Mustaballer Mustaballer closed this Jun 5, 2023
Successfully merging this pull request may close these issues.

Implement Publishing