I am simulating the compression of a rubber block, which is first compressed by a large amount (10 mm) and then compressed further by a small amount (1 mm, applied in steps of 0.1 mm). I am trying to validate the simulation against the strains observed in experiments and computed using DIC (digital image correlation - simply put, the block carries markers that are tracked at each small displacement and then used to compute the strains). Since DIC is typically not very good at tracking large displacements, I compare images between relatively small displacements (e.g., between 10.1 mm and 10.2 mm) to calculate the INSTANTANEOUS strains (i.e., the strain at any given step relative to a prior step, unlike LE22, which gives the strain relative to the initial configuration).

The way I am doing this is by extracting the nodal displacements (U2) or deformed coordinates (COOR2) at two adjacent nodes in the Y direction, subtracting them to get the length of the element in the Y direction, and comparing these lengths between two frames to get the instantaneous strain (a minimal sketch of this calculation is included after the questions below). I have the following questions:
- Is this the right approach to calculate strains? How does Abaqus calculate strains?
- In principle, the approaches using U2 and COOR2 should give the same result. Is there a reason why they give slightly different results?
- Since this is a rubber block undergoing large deformations, I have ALE adaptive meshing enabled. With the approach described above, the strain values show oscillatory behavior: for some elements the distance between the two nodes in the Y direction decreases as expected under compression, while for other elements it increases, even though the overall behavior is compressive and all elements should be compressing. Is this happening because of the adaptive meshing?
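To make the procedure concrete, here is a minimal sketch of the calculation I described above, using plain NumPy on already-extracted nodal values (the node coordinates and U2 values are made-up numbers for illustration, not from my model):

```python
import numpy as np

# Hypothetical data for two vertically adjacent nodes A and B:
# undeformed Y-coordinates plus U2 at two consecutive frames
# (e.g. the 10.1 mm and 10.2 mm displacement steps).
Y0 = np.array([5.0, 6.0])            # undeformed Y of nodes A, B (assumed values)
U2_f1 = np.array([-1.20, -1.45])     # U2 of nodes A, B at frame 1 (assumed values)
U2_f2 = np.array([-1.32, -1.60])     # U2 of nodes A, B at frame 2 (assumed values)

# Deformed coordinates, reconstructed here as Y0 + U2
# (which is why the U2 and COOR2 routes should agree in principle)
coor2_f1 = Y0 + U2_f1
coor2_f2 = Y0 + U2_f2

# Element length in Y at each frame = difference between the two nodes
L1 = coor2_f1[1] - coor2_f1[0]
L2 = coor2_f2[1] - coor2_f2[0]

# Instantaneous strain between the two frames:
# change in length relative to the length at the earlier frame
eps_inst = (L2 - L1) / L1
print(eps_inst)
```

In this sketch the U2-based and COOR2-based routes are identical by construction, since the length change between frames reduces to the change in (U2_B - U2_A); any difference between the two in the actual ODB output is exactly what my second question is about.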