Sorption: Bug?

Hi all,

I need to optimize a force field for guest-guest interactions, using argon at 87 K as the probe. To do that, I compare the simulated isotherm with the real-gas density from the NIST site, trying to match the gas-phase behavior and condense the gas at the right pressure point.
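
For reference, this is the unit conversion I use to compare a simulated average loading (atoms per box) with the molar densities tabulated by NIST; it's just a minimal bookkeeping sketch, nothing specific to the Sorption module:

```python
# Convert a GCMC average loading (atoms per simulation box) into a
# molar density (mol/L) comparable with NIST real-gas tables.
N_A = 6.02214076e23  # Avogadro's number, 1/mol

def loading_to_density(avg_atoms, box_volume_A3):
    """Molar density in mol/L from <N> atoms in a box of V A^3."""
    volume_L = box_volume_A3 * 1e-27  # 1 A^3 = 1e-27 L
    return avg_atoms / (N_A * volume_L)

# Example with my small-box numbers (see below): 0.050 atoms in 1000 A^3
print(loading_to_density(0.050, 1000.0))  # ~0.083 mol/L
```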

To start, I took an empty 1000 A^3 box and simulated my isotherm with these settings:
quality ultra-fine, no charges, electrostatic method: Ewald, van der Waals method: atom based, cutoff 20 A.
I thought I had reached convergence, having found values of epsilon and sigma that reproduced the NIST trend and condensed the gas at 95 kPa.
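
As a sanity check on this setup (assuming the usual minimum-image convention, under which the nonbond cutoff should not exceed half the shortest box edge; I'm not sure how Sorption handles larger cutoffs internally):

```python
# Half the edge length of a cubic box is the largest cutoff allowed
# under the minimum-image convention.
def max_cutoff_A(box_volume_A3):
    """Maximum minimum-image cutoff (A) for a cubic box of V A^3."""
    edge = box_volume_A3 ** (1.0 / 3.0)
    return edge / 2.0

for volume in (1000.0, 8000.0, 27000.0):
    print(volume, max_cutoff_A(volume))
# -> 5.0, 10.0, and 15.0 A respectively, all below my 20 A cutoff
```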

To verify this result, I tried two bigger boxes (8000 and 27000 A^3) with the same setup as before. To my great surprise, my optimized force field didn't condense the argon at 95 kPa but at around 300 kPa in both cases.

How can this be explained? The average loading in the 1000 A^3 box is much less than one atom (0.050).
Could this be the cause?
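
For what it's worth, a rough ideal-gas estimate (my assumption; real argon at 87 K near saturation will deviate somewhat) gives a similarly small expected occupancy for the 1000 A^3 box:

```python
# Expected <N> for an ideal gas: N = P V / (k_B T).
k_B = 1.380649e-23  # Boltzmann constant, J/K

def expected_atoms(P_Pa, V_A3, T_K):
    """Average number of ideal-gas atoms in a box of V A^3."""
    return P_Pa * V_A3 * 1e-30 / (k_B * T_K)

print(expected_atoms(95e3, 1000.0, 87.0))  # ~0.08 atoms at 95 kPa, 87 K
```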

Thanks,

Ale