Cache Pull not generating StackID

Hi!
I have separate workflows for cache_pulling (which includes git-cloning), pod_install, fastlane_build, and cache_pushing, and they all use the same stack. The problem I am facing is that cache-push does not generate a StackID every time I run it, which is why cache-pull is skipped on the next run.
However, if I include the cache-push step in the fastlane_build workflow instead, it does generate a StackID. Can someone help me get cache-push to generate a StackID when it runs in a separate workflow? Thank you!
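For reference, my setup is shaped roughly like the sketch below (simplified for the post; the step versions, inputs, and pod/fastlane details are approximations, not my exact bitrise.yml):

format_version: "11"
default_step_lib_source: https://github.com/bitrise-io/bitrise-steplib.git

workflows:
  cache_pulling:
    steps:
    - git-clone@4: {}    # clone first so the restored cache paths resolve inside the repo
    - cache-pull@2: {}   # restores the archive that cache-push uploaded last run

  pod_install:
    steps:
    - cocoapods-install@1: {}

  fastlane_build:
    steps:
    - fastlane@2:
        inputs:
        - lane: build

  cache_pushing:
    steps:
    - cache-push@2:
        inputs:
        # re-upload only when Podfile.lock's content hash changes
        - cache_paths: ./Pods -> ./Podfile.lock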

Does switching to another workflow change/update the stack?

Hello,

The cache is per branch, not per workflow. For more information on using cache-push, please review the caching documentation in the DevCenter.
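For example, chaining the push into the build with after_run still reads and writes the same per-branch cache archive as running cache-push from any other workflow on that branch (the workflow and path names below are only illustrative):

workflows:
  fastlane_build:
    after_run:
    - cache_pushing          # runs in the same build, right after fastlane_build
  cache_pushing:
    steps:
    - cache-push@2:
        inputs:
        - cache_paths: ./Pods -> ./Podfile.lock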

Thanks,
cathy

@cathy.harmon

Thank you for the reply.
I am still encountering the issue of the stack ID not being generated, though. Below is the log of cache:push:

Switching to workflow: after_build_workflow_test
+------------------------------------------------------------------------------+
| (0) cache-push@2.2.2                                                         |
+------------------------------------------------------------------------------+
| id: cache-push                                                               |
| version: 2.2.2                                                               |
| collection: https://github.com/bitrise-io/bitrise-steplib.git                |
| toolkit: go                                                                  |
| time: 2021-02-16T15:05:23Z                                                   |
+------------------------------------------------------------------------------+
|                                                                              |
Config:
- Paths:
  ./Pods -> ./Podfile.lock
  /Users/vagrant/git/Pods -> /Users/vagrant/git/Podfile.lock
- IgnoredPaths:
- CacheAPIURL: [REDACTED]
- FingerprintMethodID: file-content-hash
- CompressArchive: false
- DebugMode: true
- StackID:
Cleaning paths
Done in 490.787215ms
Checking previous cache status
No previous cache info found
Done in 8.278901ms
Generating cache archive
Done in 12.74057199s
Uploading cache archive
Archive file size: 784351232 bytes / 748.015625 MB
Done in 19.207472504s

It is working now; updating the cache:pull step to the latest version fixed it. Thank you!
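For anyone else running into this: the change was just bumping the cache-pull step version in bitrise.yml. A major-only pin like the sketch below (the workflow name is just my naming) makes the CLI resolve the newest 2.x release of the step:

workflows:
  cache_pulling:
    steps:
    - cache-pull@2: {}   # major-only pin; resolves to the latest 2.x cache-pull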

Hi @johnreyquiros! :wave:

We are really glad to hear the issue has been resolved. :tada: I’ll go ahead and close this thread but please feel free to contact us if you have any further questions!

Take care!
~Kata