Unify sampling params for reject sampling #809

Closed
npatki opened this issue May 20, 2022 · 0 comments · Fixed by #862
Labels: feature request (Request for a new feature)


npatki commented May 20, 2022

Problem Description

  1. Our sampling methods assume that sample (for all models) and GaussianCopula would never use reject sampling. This is incorrect: because of constraints, any sampling method on any model may need reject sampling.
  2. We treat batch_size and batch_size_per_try as interchangeable concepts when they are opposites: batch_size is intended to be small (to control output and save progress), while batch_size_per_try is intended to be large (to give reject sampling enough room to work).

Expected behavior

Use the same parameters for all sampling methods (sample, sample_conditions and sample_remaining_columns) across all single-table models; see the sketch after this list.

  • batch_size, which will be used in its intended way: users should be able to split large sampling tasks into smaller batches
    • Requirement: batch_size <= total num_rows
    • Default: batch_size = total num_rows
  • max_tries_per_batch, which will be used when there is any kind of reject sampling involved
    • Default: 10
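
A minimal sketch of what the unified signatures could look like. The parameter names and defaults follow the proposal above; everything else (argument order, docstrings, the bodies) is illustrative only, not the final API:

```python
# Illustrative sketch only, not the final API.
# batch_size defaults to the total number of rows; max_tries_per_batch defaults to 10.

def sample(self, num_rows, batch_size=None, max_tries_per_batch=10):
    """Sample num_rows rows, optionally split into smaller batches."""
    batch_size = num_rows if batch_size is None else min(batch_size, num_rows)
    ...

def sample_conditions(self, conditions, batch_size=None, max_tries_per_batch=10):
    """Sample rows that satisfy the given conditions."""
    ...

def sample_remaining_columns(self, known_columns, batch_size=None, max_tries_per_batch=10):
    """Sample the remaining columns for partially known rows."""
    ...
```

With these defaults, for example, model.sample(10_000, batch_size=1_000) would run ten batches of 1,000 rows each, retrying each batch up to max_tries_per_batch times if constraints reject rows.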

Logic

For every batch, on every try, dynamically update the number of rows we attempt to sample so that we end up with the full batch_size; a sketch of this loop follows.
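
A minimal sketch of that per-batch loop, assuming a hypothetical model._sample_rows(n) helper that draws n rows and returns only those that pass the constraints (the helper name and the scaling heuristic are illustrative, not the actual SDV internals):

```python
import pandas as pd

def _sample_batch(model, batch_size, max_tries_per_batch):
    """Reject-sample one batch, adjusting the request size on every try."""
    sampled = pd.DataFrame()
    num_to_sample = batch_size
    for _ in range(max_tries_per_batch):
        # Hypothetical helper: draws `num_to_sample` rows and keeps only those
        # that satisfy the constraints.
        valid_rows = model._sample_rows(num_to_sample)
        sampled = pd.concat([sampled, valid_rows], ignore_index=True)

        missing = batch_size - len(sampled)
        if missing <= 0:
            return sampled.head(batch_size)

        # Dynamically scale the next request by the acceptance rate observed on
        # this try, so the batch is likely to be completed on the next attempt.
        valid_rate = max(len(valid_rows), 1) / num_to_sample
        num_to_sample = int(missing / valid_rate) + 1

    # May contain fewer than batch_size rows if max_tries_per_batch is exhausted.
    return sampled
```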

TabularPreset
