Abaqus version: 2018.hf4
We have created an ODB and are writing tensor data at the nodes.
There are 4 steps, each with 24 frames, for ~272k nodes,
so the data array is very large: shape (272107, 96, 6).
While writing this data, we monitored the memory usage with:
session.kernelMemoryFootprint
For every frame written, the memory usage increases by ~300 MB,
and at the end the total memory usage is ~25-30 GB.
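For reference, this is how the per-frame growth can be logged. The helper below is hypothetical (not part of our script); inside CAE the getter would be `lambda: session.kernelMemoryFootprint`, injected as a callable so the helper itself is plain Python:

```python
def log_footprint_growth(get_footprint_kb, write_frame, frames):
    """Record how much the kernel footprint grows per frame written.

    get_footprint_kb -- callable returning the current kernel memory
        footprint in KB (in Abaqus/CAE: lambda: session.kernelMemoryFootprint)
    write_frame -- callable that writes one frame's field data to the ODB
    frames -- iterable of per-frame payloads
    """
    deltas = []
    for frame in frames:
        before = get_footprint_kb()
        write_frame(frame)
        # Growth attributable to this one frame's write
        deltas.append(get_footprint_kb() - before)
    return deltas
```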
Could anybody please explain why this much memory is required here?
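My own back-of-envelope estimate (not from Abaqus documentation) of where the numbers come from: the raw array is only ~1.2 GiB of doubles, but passing each frame to addData as tuples forces a Python-object representation that is several times larger, and the ODB kernel then keeps its own copy of every frame on top of that:

```python
import sys

nodes, frames, comps = 272107, 96, 6

# Raw size if stored as contiguous 8-byte doubles (e.g. the numpy array).
raw_gib = nodes * frames * comps * 8.0 / 2**30

# Rough per-node cost once one frame is expanded into a tuple of six
# Python float objects, which is approximately what each addData call
# has to build before the kernel copies it.
one_node = tuple(float(v) for v in range(comps))
per_node = sys.getsizeof(one_node) + sum(sys.getsizeof(x) for x in one_node)
tuple_gib = nodes * frames * per_node / 2.0**30

print(round(raw_gib, 2), round(tuple_gib, 1))
```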
Secondly, I thought closing the ODB would release the memory,
but even after closing it the consumed memory (~25-30 GB) is NOT released.
Because of this high memory usage, Abaqus/CAE crashes.
How can we release this memory?
We have defined the following function to write the data to the ODB:
from abaqusConstants import TENSOR_3D_FULL, NODAL, MAX_PRINCIPAL, MIN_PRINCIPAL

def write_to_tempODB(temp_subodb, isn, steps_info, data, node_labels):
    subodb_inst = temp_subodb.rootAssembly.instances[isn]
    data_count = 0
    for stpname, vals in steps_info.items():
        # Unpack step-related information
        stp_descr, stp_dom, stp_tp, stp_tt, stp_pstp, frame_data = vals
        print('Writing step data', stpname)
        if temp_subodb.steps.has_key(stpname):  # Abaqus 2018 runs Python 2.7
            istep = temp_subodb.steps[stpname]
        else:
            istep = temp_subodb.Step(name=stpname, description=stp_descr,
                                     domain=stp_dom, timePeriod=stp_tp,
                                     totalTime=stp_tt,
                                     previousStepName=stp_pstp)
        for (frm_id, frm_incr, frm_val, frm_descr) in frame_data:
            # Create a frame in the step
            iframe = istep.Frame(incrementNumber=frm_incr, frameValue=frm_val,
                                 description=frm_descr)
            # Create the total-strain field output for this frame
            ifield = iframe.FieldOutput(
                name='E', type=TENSOR_3D_FULL,
                description='Total Strain components',
                componentLabels=('E11', 'E22', 'E33', 'E12', 'E13', 'E23'),
                validInvariants=(MAX_PRINCIPAL, MIN_PRINCIPAL),
            )
            # One (nodes x 6) slab per frame
            ifield.addData(position=NODAL, instance=subodb_inst,
                           labels=tuple(node_labels),
                           data=tuple(data[:, data_count, :]))
            data_count += 1
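One workaround we are considering (an assumption on our part, not an official recommendation): the CAE kernel appears to hold on to the heap it has grown even after the ODB is closed, so the writing could be done in a short-lived worker process whose memory the OS reclaims entirely on exit. A generic sketch of that pattern; in the real workflow the child would be launched via the `abaqus python` command, with `sys.executable` standing in here so the sketch runs anywhere:

```python
import subprocess
import sys

def run_isolated(script_path, args=()):
    """Run a heavy ODB-writing script in its own interpreter process.

    When the child process exits, the OS reclaims all of its memory,
    unlike freeing objects inside a long-lived Abaqus/CAE session.
    """
    # Real usage would replace sys.executable with the abaqus launcher,
    # e.g. ['abaqus', 'python', script_path] (assumed invocation).
    cmd = [sys.executable, script_path] + list(args)
    return subprocess.call(cmd)
```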