Waarp Gateway · Issues · #200 (Closed)
Issue created Oct 30, 2020 by Paolo Pantellini (@paolo.pantellini), Maintainer

Reduce the number of database updates made by the pipeline

During a data transfer, the pipeline updates the transfer progress in the database each time the Read/Write function is called (so roughly once per SFTP packet, or once per R66 block). This is far too frequent, and for large file transfers over fast connections it can cause a severe performance penalty.

The pipeline should therefore be changed to update the database only at a fixed time interval, so that the database does not become a bottleneck for the data transfer.
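A minimal sketch of what such a throttled updater could look like in Go (the `ProgressStore` interface, type names, and the update interval are hypothetical, not the actual Waarp Gateway API): the data path only increments an in-memory counter, and a background goroutine flushes it to the database once per tick.

```go
package pipeline

import (
	"sync/atomic"
	"time"
)

// ProgressStore is a stand-in for the database layer; the real
// Waarp Gateway interface likely differs.
type ProgressStore interface {
	UpdateTransferProgress(transferID uint64, progress uint64) error
}

// progressUpdater accumulates the byte count in memory and flushes it
// to the database at a fixed interval instead of on every Read/Write.
type progressUpdater struct {
	store      ProgressStore
	transferID uint64
	progress   uint64 // updated atomically by the data path
	done       chan struct{}
}

func newProgressUpdater(store ProgressStore, transferID uint64, interval time.Duration) *progressUpdater {
	u := &progressUpdater{store: store, transferID: transferID, done: make(chan struct{})}
	go u.flushLoop(interval)
	return u
}

// AddProgress is called from Read/Write; it only touches memory.
func (u *progressUpdater) AddProgress(n int) {
	atomic.AddUint64(&u.progress, uint64(n))
}

// flushLoop writes the latest progress to the database once per tick,
// and once more when the transfer ends.
func (u *progressUpdater) flushLoop(interval time.Duration) {
	ticker := time.NewTicker(interval)
	defer ticker.Stop()
	for {
		select {
		case <-ticker.C:
			_ = u.store.UpdateTransferProgress(u.transferID, atomic.LoadUint64(&u.progress))
		case <-u.done:
			// Final flush so the last progress value is persisted.
			_ = u.store.UpdateTransferProgress(u.transferID, atomic.LoadUint64(&u.progress))
			return
		}
	}
}

// Close stops the flush loop after one last update.
func (u *progressUpdater) Close() {
	close(u.done)
}
```

With this shape, the number of database writes per transfer depends only on the transfer duration and the chosen interval, not on the block or packet size.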
