Using an Exodus sizing function with Trelis 17.1 or later

I received a link to set up an account on the Coreform Service Desk portal, which I have done. Do I need to submit a request from there?

Thanks,
Charles

Charles,

No - we went ahead and put in your ticket some time ago. Apparently, we’d been replying to the ticket, but we’d failed to add an account for you – so you’ve not been receiving our responses!

– Greg

Aha. I don’t see anything there when I log in. Do I need to do something else?

Thanks,
Charles

After looking at the files you sent, it appears that the problem was just a misunderstanding on my part about what the dimensions of the sizing array should be. I had assumed a single array that could potentially hold several variables:

double vals_nod_var(time_step, num_nod_var, num_nodes)

I should have used a separate array for each variable (assuming there is more than one):

double vals_nod_var1(time_step, num_nodes)

That seems to have fixed my problem. I also note that the files you are using were written with a much older API version (5.1f vs. 8.03f), and that the floating-point values are all declared as float instead of double, but this doesn’t seem to make a difference.
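A quick way to confirm which layout a given Exodus file actually uses is to list its vals_nod_var* variables. The sketch below assumes the Python netCDF4 package and a placeholder file name; it only reads the file, so it is safe to run on anything:

    # Sketch: list the nodal-variable arrays in an Exodus file.
    # Assumes the netCDF4 package; the file name is a placeholder.
    from netCDF4 import Dataset

    with Dataset("mesh.e", "r") as ds:
        for name, var in ds.variables.items():
            if name.startswith("vals_nod_var"):
                print(name, var.dimensions)

    # With one array per variable this prints, for example:
    #   vals_nod_var1 ('time_step', 'num_nodes')
    # rather than a single vals_nod_var('time_step', 'num_nod_var', 'num_nodes').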

Thanks again for your help.

Cheers,
Charles

Question: where in the documentation could we clarify the misunderstanding Charles had about how these arrays are laid out? It seems a brief example would be useful. @karl @gvernon

The documentation is part of Exodus. However, the way we are using Exodus may be a legacy implementation; it is still documented and still valid. My guess is that we are not supporting some of the latest API versions for reading field/gradient data out of Exodus.

A small example would definitely be useful. My problems arose because the method I was using worked with older versions of Trelis if I used the small Exodus format. I now think that was just a fluke, and that I should have been using this method all along. A simple example could include:

  1. The journal file to create the initial mesh.
  2. A very simple Python script to add the sizing function field (a rough sketch follows this list).
  3. A final journal file that makes use of the sizing function information.
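As a rough sketch of item 2, the script below adds a single nodal sizing variable to an existing Exodus mesh using the Python netCDF4 package. The file name, the variable name "sizing", and the uniform sizing value are placeholders for illustration, not the actual files from this thread; the standard Exodus dimensions (num_nodes, time_step, len_string) are assumed to already exist in the exported mesh:

    # Sketch of item 2: add a nodal sizing variable to an existing Exodus mesh.
    # Assumes the netCDF4 package; file name, variable name, and sizing value are
    # placeholders. The mesh is expected to already define the standard Exodus
    # dimensions num_nodes, time_step (unlimited), and len_string.
    import numpy as np
    from netCDF4 import Dataset, stringtochar

    with Dataset("initial_mesh.e", "a") as ds:       # mesh from the first journal file
        num_nodes = ds.dimensions["num_nodes"].size
        name_len = ds.dimensions["len_string"].size  # Exodus string length (often 33)

        # Declare one nodal variable and record its name so readers can find it.
        ds.createDimension("num_nod_var", 1)
        names = ds.createVariable("name_nod_var", "S1", ("num_nod_var", "len_string"))
        names[:] = stringtochar(np.array(["sizing"], dtype=f"S{name_len}"))

        # One array per nodal variable: vals_nod_var1(time_step, num_nodes).
        vals = ds.createVariable("vals_nod_var1", "f8", ("time_step", "num_nodes"))
        vals[0, :] = np.full(num_nodes, 0.1)         # placeholder: uniform target size

        # Make sure the single time step has a value.
        if "time_whole" not in ds.variables:
            ds.createVariable("time_whole", "f8", ("time_step",))
        ds.variables["time_whole"][0] = 0.0

The journal files for items 1 and 3 would use Trelis commands, which I have not sketched here.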

This might be a bit much for user documentation, but could you provide a link to such an example from the manual?

Thanks,
Charles

p.s. The Exodus documentation is very extensive, but it is hard to find what you’re looking for in it.

Hi Charles,

I got some feedback on this problem from Dassault, who now own the MeshGems library. A change in the adaptive meshing library to use double-precision values increased its memory requirements, and there is no user-modifiable setting for the amount of memory. They have also added some automatic memory management that may resolve the issue. I will make this a high-priority bug and see if we can get you a fix to test soon.

My surface adaptivity run took about 8 hours. Dassault indicated that this seemed too long and recommended adjusting the initial mesh size to be closer to the adaptive size. I will look at the problem some more to determine what is possible.

Thanks very much, Karl! I really appreciate your work on this. Please let me know if there’s anything I can do from this end.

One thing I’ve noticed about meshing complex surfaces (in addition to the extremely long meshing times) is that any calculation involving the condition number, which several quality measures rely on, is extremely slow. Just evaluating the surface mesh quality can take an hour or more. I’m not sure why; I don’t recall this happening in previous versions of Trelis, although it may have been true then, too. I don’t think condition number evaluations for volume meshes take nearly as long.

Cheers,
Charles