RE: Publishing time in SDL

Hi,

We have a requirement to publish almost 9,000 Publication objects, but SDL can publish at most three in parallel.

We don't know how long it would take for 9,000 Publication objects to complete publishing. We are using the API to trigger the publications.

Is there any way we can speed up this process? Help is much appreciated.
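For context, the client side of this setup can be sketched as a submitter that keeps at most three publish jobs in flight, matching the server-side limit. This is purely illustrative: `trigger_publish` is a hypothetical placeholder for whatever API request (SOAP, REST, or ISHRemote) actually starts a publication, not the real SDL API.

```python
# Minimal sketch: throttle publish requests to the three-slot limit.
# trigger_publish() is a hypothetical stand-in for the real API call.
from concurrent.futures import ThreadPoolExecutor, as_completed
import time

MAX_PARALLEL = 3  # observed SDL concurrent-publishing limit

def trigger_publish(pub_id):
    """Placeholder: start one publication and wait until it finishes."""
    time.sleep(0.01)  # simulate the publish duration
    return pub_id, "done"

def publish_all(publication_ids):
    """Submit every publication, never exceeding MAX_PARALLEL at once."""
    results = {}
    with ThreadPoolExecutor(max_workers=MAX_PARALLEL) as pool:
        futures = [pool.submit(trigger_publish, p) for p in publication_ids]
        for fut in as_completed(futures):
            pub_id, status = fut.result()
            results[pub_id] = status
    return results
```

With 9,000 publications and only three slots, total wall-clock time is roughly (9,000 / 3) times the average publish duration, which is why the bottleneck is the server-side limit rather than the client.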


Thanks,

PaulGregory

  • Any solution would probably require significant additional resources and a redesign of the publishing architecture. Unless you republish all 9000+ publications at the beginning of a week when everyone is on vacation.

    Some off-hand crazy ideas:

    1. Use ISHServer / ISHBootstrap / ISHDeploy to spin up a couple hundred instances of SDL and offload the tasks to those VMs.
    - ISHBootstrap: github.com/.../ISHBootstrap
    - ISHServer: github.com/.../ISHServer
    - ISHDeploy: github.com/.../ISHDeploy

    2. Build a mechanism in your DITA-OT plugin to offload publishing tasks to a VM cluster that sits behind a load balancer.

    3. Run several hundred Docker instances in parallel that:
    - Grab a subset of your 9000+ publications
    - Look up the baseline for each publication
    - Export all topics, maps, images, etc., from the baseline
    - Run content through a local instance of the DITA-OT (with your plugins installed)
    - Copy the output over to a site where writers can find it

    The last option may be the least desirable, because it separates the output in SDL from the output your writers use.
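The per-container loop in option 3 can be sketched as follows. All of the helpers (`lookup_baseline`, `export_baseline`, `copy_to_site`) are hypothetical placeholders for real repository/API calls, and only the `dita` command line reflects the standard DITA-OT CLI; treat this as a shape for the worker, not an implementation.

```python
# Rough sketch of the option-3 worker: each Docker container processes
# its assigned slice of the 9,000+ publications. Helper functions are
# hypothetical placeholders for the real CMS calls.
from pathlib import Path

def lookup_baseline(pub_id):
    # Placeholder: query the CMS for the publication's frozen baseline.
    return f"baseline-{pub_id}"

def export_baseline(baseline, dest):
    # Placeholder: export all topics, maps, and images in the baseline.
    Path(dest).mkdir(parents=True, exist_ok=True)

def dita_ot_command(src, out):
    # Standard DITA-OT CLI invocation; the local install supplies your
    # plugins. Run it with subprocess.run(cmd, check=True).
    return ["dita", "--input", str(Path(src) / "main.ditamap"),
            "--format", "html5", "--output", str(out)]

def copy_to_site(out, site, pub_id):
    # Placeholder: upload the rendered output where writers look for it.
    pass

def worker(publication_ids, workdir, site):
    """Process this container's slice of the publication list."""
    for pub_id in publication_ids:
        baseline = lookup_baseline(pub_id)
        src = Path(workdir) / str(pub_id)
        export_baseline(baseline, src)
        cmd = dita_ot_command(src, src / "out")
        # subprocess.run(cmd, check=True)  # enable on a host with DITA-OT
        copy_to_site(src / "out", site, pub_id)
    return len(publication_ids)
```

Sharding is then just splitting the publication list into N slices, one per container, so total time scales roughly with 1/N instead of the fixed three-slot limit.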