
BigQuery driver – Error: invalid_grant: Invalid JWT: Token must be a short-lived token #372

Closed
philippefutureboy opened this issue Feb 11, 2020 · 8 comments
Labels
question The issue is a question. Please use Stack Overflow for questions.

Comments

@philippefutureboy
Contributor

Describe the bug
When using Cube.js in development mode with the BigQuery driver, I get the following error:

datawarehouse-api    | Error: invalid_grant: Invalid JWT: Token must be a short-lived token (60 minutes) and in a reasonable timeframe. Check your iat and exp values in the JWT claim.
datawarehouse-api    |     at Gaxios._request (/workdir/cube/node_modules/gaxios/build/src/gaxios.js:85:23)
datawarehouse-api    |     at processTicksAndRejections (internal/process/task_queues.js:97:5)
datawarehouse-api    |     at async GoogleToken.requestToken (/workdir/cube/node_modules/gtoken/build/src/index.js:202:23)
datawarehouse-api    |     at async JWT.refreshTokenNoCache (/workdir/cube/node_modules/google-auth-library/build/src/auth/jwtclient.js:156:23)
datawarehouse-api    |     at async JWT.getRequestMetadataAsync (/workdir/cube/node_modules/google-auth-library/build/src/auth/oauth2client.js:265:17)
datawarehouse-api    |     at async JWT.getRequestHeaders (/workdir/cube/node_modules/google-auth-library/build/src/auth/oauth2client.js:244:26)
datawarehouse-api    |     at async GoogleAuth.authorizeRequest (/workdir/cube/node_modules/google-auth-library/build/src/auth/googleauth.js:544:25)

To Reproduce
Steps to reproduce the behavior:
(disclaimer: these steps haven't been re-run to verify that they reproduce the behaviour)

  1. Create the following Dockerfile:
FROM node:alpine
WORKDIR /workdir
RUN npm install -g cubejs-cli
RUN cubejs create cube -d bigquery
WORKDIR cube
CMD ["npm", "run", "dev"]
  2. Create the following docker-compose file:
version: "3.7"
services:
  datawarehouse-api:
    build:
      context: .
    container_name: datawarehouse-api
    env_file:
      - ./.env
    ports:
      - "5110:4000"
    volumes:
      - "./schema:/workdir/cube/schema"
  3. Specify valid values in the .env file for the following variables (see the example .env sketch after this list):
CUBEJS_DB_BQ_PROJECT_ID
CUBEJS_DB_BQ_CREDENTIALS
CUBEJS_DB_TYPE
CUBEJS_API_SECRET
  4. Copy the default Orders schema to the ./schema folder
  5. Attempt to run a query against the Orders schema
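
For reference, here is a minimal .env sketch for step 3. Every value below is a placeholder; CUBEJS_DB_BQ_CREDENTIALS holds the base64-encoded service account key described later in this thread:

# .env (placeholder values, substitute your own)
CUBEJS_DB_TYPE=bigquery
CUBEJS_API_SECRET=<long random string>
CUBEJS_DB_BQ_PROJECT_ID=<gcp-project-id>
CUBEJS_DB_BQ_CREDENTIALS=<base64-encoded service account JSON key>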

Expected behavior
No JWT errors

Version:

$ docker container exec datawarehouse-api cubejs --version
0.14.0
$ docker container exec datawarehouse-api cat package.json
{
  "name": "cube",
  "version": "0.0.1",
  "private": true,
  "scripts": {
    "dev": "node index.js"
  },
  "dependencies": {
    "@cubejs-backend/bigquery-driver": "^0.14.0",
    "@cubejs-backend/server": "^0.14.3"
  }
}
@paveltiunov
Member

@philippefutureboy Hey Philippe! Interesting. Could you please provide all the steps, starting from service account creation and ending with setting the actual value of CUBEJS_DB_BQ_CREDENTIALS?

@paveltiunov paveltiunov added the question The issue is a question. Please use Stack Overflow for questions. label Feb 11, 2020
@philippefutureboy
Contributor Author

philippefutureboy commented Feb 11, 2020

@paveltiunov Yup!

  1. Create service account in GCP Console & grant Owner access for the project
  2. Save the JSON key of the service account to a .json file on your filesystem
  3. Run cat gcp_service_key.json | base64 -w 0
  4. Paste the previous step's result in the value of CUBEJS_DB_BQ_CREDENTIALS
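
As a sanity check (not part of the steps above, just a way to rule out a mangled key), the encoded value can be round-tripped:

# Encode the key without line wrapping (GNU coreutils base64)
base64 -w 0 gcp_service_key.json > credentials.b64
# Decode it back and confirm nothing was lost
base64 -d credentials.b64 | diff - gcp_service_key.json && echo "round-trip OK"

If the diff is clean, the value pasted into CUBEJS_DB_BQ_CREDENTIALS is byte-for-byte the original key.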

@paveltiunov
Member

@philippefutureboy Are you on Mac? Here's what I found:
googleapis/google-cloud-python#3100
docker/for-mac#2076
Restarting your Docker or your Mac can be a solution here.

@philippefutureboy
Contributor Author

@paveltiunov Thanks! However, the bug was produced on Windows 10 x64.

@paveltiunov
Member

paveltiunov commented Feb 11, 2020

@philippefutureboy Do you think the machine has the correct time set? If so, could you please attach a terminal to the docker container and check the clock there?
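
Something along these lines should be enough to compare the two clocks (container name taken from your docker-compose file above):

# Host clock vs. container clock, both in UTC
date -u
docker container exec datawarehouse-api date -u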

@philippefutureboy
Contributor Author

@paveltiunov Confirmed that this is the issue - docker container clock has drifted!

$ docker container exec datawarehouse-api date
Mon Feb  3 12:13:26 UTC 2020

Thanks a lot for your help! We'll take it from here and post the steps taken for documentation purposes.
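
For documentation, the drift can be quantified roughly like this (assuming a date that supports +%s on both sides, which GNU date on the host and busybox date in the alpine container both do):

# Difference between host and container clocks, in seconds
echo $(( $(date -u +%s) - $(docker container exec datawarehouse-api date -u +%s) ))

In our case the container was about eight days behind (Feb 3 vs Feb 11), far outside the "reasonable timeframe" the error message refers to.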

@sivanaltinakar

@paveltiunov Thanks a lot for the help!

@philippefutureboy
Contributor Author

@paveltiunov The problem was related to clock drift in the Docker containers on Windows, so we opted to drop local development and simply run it on Google Kubernetes Engine instead :)
