Hi everyone,
I have a batch of files I am trying to process. I have managed to align them all to a common reference file, and now I am running a script designed to extract the ligands, or the binding sites together with their ligands.
It is not perfect (it picks up some ions and crystallisation solvents), but it works well enough on a per-structure basis.
The script is attached, and also pasted at the end of this message.
My problem is that it seems to crash my client after 40+ files. The exact point is variable, but I can't process the full 139 files in one go. I suspect some form of memory leak, as I can see a constant increase in page file use in Task Manager as the script progresses. Interestingly, if I give it a smaller batch to process I still see the constant increase, but once the script completes the page file use drops back to the starting level, so the memory does get cleared eventually, just not while my script is running.
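Since the memory only comes back once the script exits, one possible workaround (a sketch only, not tested against the Discovery Studio client) is to split the input list into smaller chunks and run the extraction script once per chunk, so each run starts and ends in a fresh session. The batch size of 25 and the `split_into_batches` helper are my own invention, not part of the original script:

```perl
#!/usr/bin/perl -w
use strict;

# Hypothetical helper: split a list of PDB names into fixed-size batches.
# Each batch could then be written to its own list file and processed in a
# separate client run, so any memory held per-document is released between runs.
sub split_into_batches {
    my ( $batch_size, @items ) = @_;
    my @batches;
    push @batches, [ splice( @items, 0, $batch_size ) ] while @items;
    return @batches;
}

# Example: 139 names in chunks of 25 gives six shorter runs instead of one long one.
my @pdb     = map { sprintf( "name%03d", $_ ) } 1 .. 139;    # stand-in for xray.csv
my @batches = split_into_batches( 25, @pdb );
print scalar(@batches), " batches\n";                        # prints "6 batches"
```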
Any ideas where I have gone wrong?
Tom
#!/usr/bin/perl -w
use strict;
use MdmDiscoveryScript;
use ProteinDiscoveryScript;
use MdmCommands;
use SbdDiscoveryScript;
open( XRAY, "xray.csv" ) || die "cannot open xray.csv: $!";
my @pdb = <XRAY>;    # one PDB name per line
close(XRAY);
print "\n@pdb \n";
my $pdb_size = @pdb;
print $pdb_size;

chomp(@pdb);
print "\nafter chomp: \n @pdb \n \n";
$pdb_size = @pdb;
print "\npost chomp size: $pdb_size\n";
print $pdb[1];    # sanity check: second entry in the list

my $tail      = '_lig';
my $pdb_count = 0;
while ( $pdb_count < $pdb_size ) {
    my $name = $pdb[$pdb_count];
    print " \n $name \n ";

    my $document     = DiscoveryScript::Open("$name.pdb");
    my $ligandschain = GetLigands($document);

    if ( !$ligandschain->IsEmpty ) {
        SetSelection( $document, $ligandschain );
        # $document->SelectByRadius( 7.0, 'AminoAcid' );    # uncomment to keep the binding site too
        $document->InvertSelection();
        $document->DeleteObjects();

        my $file_name = $name . $tail;
        print " \n $file_name \n";
        $document->Save( "$file_name.pdb", 'pdb' );
    }

    $document->Close();
    undef $document;    # also drop the lexical handle, in case the client frees memory through it
    $pdb_count++;
}