SDL Trados Studio
Is there a limitation on, or recommendation about, the size of your TM? I have not been able to use a 14 GB TM with 2.5 million TUs because it causes Studio to crash. Yes, I waited several hours and retried several times until it was upgraded. No, this didn't happen before Studio 2019 (although the TM was smaller back then).
There isn't strictly a limit, but the more recent versions of Studio also extract fragments, so the size of the TM and the work Studio has to do can definitely affect its ability to handle larger TMs. I played around with a fairly large TM containing 5 million TUs, so double the size of yours. I was able to convert this, even building a translation model for upLIFT:
It certainly took a while, but it was possible. However, it's also worth noting that the actual content of the TM can affect this too. In this case you can see that many duplicate or unnecessary TUs were removed, so I ended up with 3.4 million TUs. If the content had been heavily tagged, with long sentences, few recognised tokens and so on, then the TM could have been effectively larger and the upgrade might even have failed.
So there is no black-and-white answer. In general, though, we are talking about a desktop tool that is designed to extract as much use as it possibly can from the TM, and this takes a lot of resources. In practice, I think that once you get to around a million TUs you can expect performance issues. At least, that's the sort of experience I see.
We are also always working on improving this, so we may be able to handle larger and more complex data in the future. Until then it may be wise to split a large TM up if you don't get the results you are looking for. Being able to work with multiple TMs is helpful in this way.
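If you do decide to split a large TM, one practical route is to export it to TMX (the standard TM exchange format Studio can export and import) and divide the TUs across several smaller files, each of which becomes its own TM. As a minimal sketch, the function below splits a TMX export into numbered chunks; the file names and chunk size are illustrative, not anything Studio prescribes:

```python
# Sketch: split a TMX export into smaller TMX files so each chunk can be
# imported into its own TM. Assumes a well-formed TMX file with a single
# <header> and <body>; chunk size and naming are illustrative.
import copy
import xml.etree.ElementTree as ET

def split_tmx(path, tus_per_chunk=500_000, prefix="tm_part"):
    """Write the <tu> elements of `path` into numbered TMX chunk files."""
    tree = ET.parse(path)
    root = tree.getroot()               # <tmx>
    header = root.find("header")
    tus = root.find("body").findall("tu")
    outputs = []
    for i in range(0, len(tus), tus_per_chunk):
        # Rebuild a valid TMX skeleton for each chunk: same <tmx>
        # attributes, a copy of the original <header>, a fresh <body>.
        chunk_root = ET.Element("tmx", root.attrib)
        chunk_root.append(copy.deepcopy(header))
        body = ET.SubElement(chunk_root, "body")
        body.extend(tus[i:i + tus_per_chunk])
        out = f"{prefix}_{i // tus_per_chunk + 1}.tmx"
        ET.ElementTree(chunk_root).write(
            out, encoding="utf-8", xml_declaration=True
        )
        outputs.append(out)
    return outputs
```

Each resulting file can then be imported into a separate TM, and Studio's support for multiple TMs in a project lets you use them together. For a 14 GB export you would likely want a streaming parser (e.g. `ET.iterparse`) rather than loading the whole file, but the chunking idea is the same.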