My customer has sent me a database with 329,866 entries. I have got an XML export and the corresponding XDT. Creating a TB is not a problem, but I cannot import into it: after trying the import in various ways I end up with 0 entries and a log file telling me that entries up to 102,494 have been added. From that entry up to the end of the database it says that the entry does not fit the termbase definition (which can be excluded, as the XML certainly does fit the TB definition). The reason given is: Database ´´ could not be opened. Either the database is not recognized by your application or the file is damaged. Source: MultiTerm140.Server.Termbase14.0
Unfortunately I am not allowed to pass the XML or the XDT to anyone, but maybe you can give me a hint what to do based on the error messages I posted.
Is it a MultiTerm Server termbase or a local termbase? According to (not only my) experience, such large termbases only work reliably on MultiTerm Server or SDL Language Cloud Terminology (if you are allowed to upload your client's data there).
Have you checked the "Reorganize" option when creating the local termbase? For such large imports it is better not to, because the reorganization step often fails and corrupts the termbase.

You may try to manipulate the registry key mentioned in https://gateway.sdl.com/apex/communityknowledge?articleName=Error-message-when-importing-a-termbase-in-SDL-MultiTerm-File-sharing-lock-count-exceeded-Increase-MaxLocksPerFile-registry-entry if you want to reorganize your termbase in a separate step later.

Another possible root cause is sync applications running in the background, see gateway.sdl.com/.../communityknowledge

Good luck!
Christine
Many thanks for your detailed answer.
This was a server TB, exported to XML. I know the limits, though my customer claims to have tested the XML import into a local TB with MT2015!
I have already increased the maximum locks (MaxLocksPerFile) to 50,000, and it helped partially.
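For anyone following along: the change the linked KB article describes boils down to a single DWORD value. A minimal .reg sketch of that change, under the assumption that MultiTerm's local termbase uses the 32-bit Jet 4.0 database engine on 64-bit Windows (hence the Wow6432Node path; the KB article names the exact key for your MultiTerm version, so follow it rather than this sketch):

```ini
Windows Registry Editor Version 5.00

; Assumption: 32-bit Jet 4.0 engine on 64-bit Windows.
; 0xC350 = 50,000 locks per file (the default is 9,500).
[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\Jet 4.0]
"MaxLocksPerFile"=dword:0000c350
```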
So to start from the beginning:
When I attempt to process the data located on a NAS, the complete process fails.
Attempting to do so on a local HDD fails too; however, in both cases I had chosen to reorganize the TB after import.
The last attempt was to process the data on a local SSD, which led to a partial success: I was able to import over 100,000 entries.
I can exclude sync operations, as all my data is stored locally and NOT synced to the Internet from this PC.
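Given that over 100,000 entries did go through before the failure, one workaround worth trying is to split the XML export into smaller batches and import them one after another into the same TB. A minimal sketch, assuming the export follows the usual MultiTerm XML layout (an <mtf> root element with one <conceptGrp> child per entry); the function name and chunk size are my own choices:

```python
# Sketch: split a large MultiTerm XML export into smaller chunk files so
# the import can be run in batches. Assumes an <mtf> root whose direct
# children are the <conceptGrp> entries.
import xml.etree.ElementTree as ET

def split_mtf(source_path, entries_per_chunk=50000):
    """Write the entries of source_path into numbered chunk files.

    Returns the list of chunk file names created.
    """
    tree = ET.parse(source_path)
    root = tree.getroot()            # expected tag: mtf
    entries = list(root)             # one <conceptGrp> per entry
    chunk_files = []
    for i in range(0, len(entries), entries_per_chunk):
        # New root carrying the same tag and attributes as the original
        chunk_root = ET.Element(root.tag, root.attrib)
        chunk_root.extend(entries[i:i + entries_per_chunk])
        name = f"chunk_{i // entries_per_chunk + 1:03d}.xml"
        ET.ElementTree(chunk_root).write(
            name, encoding="utf-8", xml_declaration=True)
        chunk_files.append(name)
    return chunk_files
```

Each chunk can then be imported separately; if one chunk fails, you at least know roughly which entry range contains the problem.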
I will certainly inform the customer that such a TB will most probably not work on 99% of freelance computers, as few freelancers have machines as powerful as mine. Many, if not most, have somewhat better-than-average laptops, but certainly not the fastest i7 processors, tons of RAM and huge SSDs. And I will ask for permission to upload to LC (do I have one?).
Hi Jerzy,

I fully agree that such large termbases will not work on 99% of freelance computers. I think it is important that GroupShare / MultiTerm Server customers are informed that large server termbase exports mostly do not work reliably in local MultiTerm versions. I have tried to encourage SDL to spread/publish such information, but without success.
Each freelancer has one free Language Cloud Terminology termbase; there are paid subscription models available if you need more than one. If you have participated in the LC Terminology Beta, you might still have more termbases available: you can activate only one at a time, but you can switch between the available termbases without losing the others.
But ultimately, it would be better if the customer made the termbase available online, via a GroupShare/MultiTerm Server installation that is accessible to the translator, or via LC Terminology (there are also business packages available, see https://gateway.sdl.com/apex/communityknowledge?articleName=000008698 - but I cannot say whether SDL LC Terminology is already mature enough to actually recommend it to a corporate customer).