Description
What version of Turborepo are you using?
1.0.24
What package manager are you using / does the bug impact?
Yarn v1
What operating system are you using?
Linux
Describe the Bug
I'm seeing intermittent cache misses that I believe are caused by a failure to PUT/GET a given cache key to/from Vercel. In most cases I see no log output indicating that the HTTP request failed (which is why I'm unsure whether the failure occurs on PUT or GET), but on one occasion I did see log output in Vercel that may be helpful. See below.
First run, this behavior is expected; this cache entry should not exist, yet, so the build is executed:
2022-01-03T11:42:43.854Z [DEBUG] run.api:build: start
2022-01-03T11:42:43.854Z [DEBUG] run.api:build: task output globs: outputs=[".turbo/turbo-build.log", "cdk.out/**"]
2022-01-03T11:42:43.854Z [DEBUG] run.api:build: task hash: value=3dd8da4e3f3dd6b0
2022-01-03T11:42:43.854Z [DEBUG] run.api:build: log file: path=apps/api/.turbo/turbo-build.log
2022-01-03T11:42:43.855Z [DEBUG] run: performing request: method=GET url=https://api.vercel.com/v8/artifacts/3dd8da4e3f3dd6b0?slug=***
...
api:build: cache miss, executing 3dd8da4e3f3dd6b0
...
2022-01-03T11:43:10.928Z [DEBUG] run.api:build: caching output: outputs=[".turbo/turbo-build.log", "cdk.out/**"]
2022-01-03T11:43:10.945Z [DEBUG] run.api:build: done: status=complete duration=27.091337379s
2022-01-03T11:43:11.355Z [DEBUG] run: performing request: method=PUT url=https://api.vercel.com/v8/artifacts/3dd8da4e3f3dd6b0?slug=***
I then re-run the job and see an unexpected cache miss:
2022-01-03T12:06:04.257Z [DEBUG] run.api:build: start
2022-01-03T12:06:04.258Z [DEBUG] run.api:build: task output globs: outputs=[".turbo/turbo-build.log", "cdk.out/**"]
2022-01-03T12:06:04.258Z [DEBUG] run.api:build: task hash: value=3dd8da4e3f3dd6b0
2022-01-03T12:06:04.259Z [DEBUG] run.api:build: log file: path=apps/api/.turbo/turbo-build.log
2022-01-03T12:06:04.259Z [DEBUG] run: performing request: method=GET url=https://api.vercel.com/v8/artifacts/3dd8da4e3f3dd6b0?slug=***
...
api:build: cache miss, executing 3dd8da4e3f3dd6b0
...
2022-01-03T12:06:31.097Z [DEBUG] run.api:build: caching output: outputs=[".turbo/turbo-build.log", "cdk.out/**"]
2022-01-03T12:06:31.118Z [DEBUG] run.api:build: done: status=complete duration=26.860623477s
2022-01-03T12:06:31.644Z [DEBUG] run: performing request: method=PUT url=https://api.vercel.com/v8/artifacts/3dd8da4e3f3dd6b0?slug=***
The above output is from GitHub Actions. In one instance (not the above case), I did see the following log output in Vercel that may be a hint:
2022-01-02T12:17:21.421Z [DEBUG] run.web:build: done: status=complete duration=25.849629175s
06:17:21.423 | [ERROR] Error uploading artifacts to HTTP cache: archive/tar: write too long
The cache artifacts for the api:build task in the GitHub Actions case are 2.2 MB (zipped). The artifacts for the web:build task in Vercel (the one with the visible write error) are 19.7 MB (zipped).
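For reference, that error string comes from Go's archive/tar package: its writer returns ErrWriteTooLong when more bytes are written to an entry than the Size declared in its header. A minimal sketch of how that error is produced (the header values here are purely illustrative, not taken from turbo's code):

package main

import (
	"archive/tar"
	"bytes"
	"fmt"
)

func main() {
	var buf bytes.Buffer
	tw := tar.NewWriter(&buf)

	// The header declares a 4-byte file...
	hdr := &tar.Header{Name: "turbo-build.log", Mode: 0600, Size: 4}
	if err := tw.WriteHeader(hdr); err != nil {
		panic(err)
	}

	// ...but more than 4 bytes are written, so the writer rejects the excess.
	if _, err := tw.Write([]byte("12345678")); err != nil {
		fmt.Println(err) // archive/tar: write too long
	}
}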
Expected Behavior
I expect that a given cache key will be successfully PUT to the Vercel remote cache and then retrieved on subsequent runs when a matching hash is calculated.
To Reproduce
As mentioned, the cache artifacts for which I've seen failures are 2.2 MB (zipped) and 19.7 MB (zipped). I wouldn't expect the actual contents of the cache to be pertinent, but I can provide them if needed.
In terms of repro steps, I'm running turbo in GitHub Actions to deploy a CDK app to AWS, and in Vercel to deploy a Next.js app. I've seen these intermittent failures on both platforms.
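If it helps narrow down whether the failure is on the PUT or the GET side, one check I can run right after a job is to replay the GET from the debug log against the artifacts endpoint. A rough sketch (the slug is redacted in the logs above, so it's a placeholder here, and the env var names are just for this sketch; the token should be whatever turbo itself uses to authenticate with the remote cache):

package main

import (
	"fmt"
	"net/http"
	"os"
)

func main() {
	// Same request the debug log shows turbo issuing for the api:build hash.
	url := "https://api.vercel.com/v8/artifacts/3dd8da4e3f3dd6b0?slug=" + os.Getenv("TEAM_SLUG")

	req, err := http.NewRequest(http.MethodGet, url, nil)
	if err != nil {
		panic(err)
	}
	// Bearer token for the remote cache; env var name is arbitrary for this sketch.
	req.Header.Set("Authorization", "Bearer "+os.Getenv("TURBO_TOKEN"))

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// A 200 right after the first run would suggest the PUT landed and the later
	// miss happened on the GET side; a 404 would point at the upload failing.
	fmt.Println(resp.Status)
}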