Heat transfer, mesh sensitivity analysis?

Hi,
I need to do a grid (mesh) sensitivity analysis to find the node size that gives the most accurate heat loss. The problem is that decreasing the node size changes the heat loss, but it never converges. I simplified the code to the simplest case, with constant temperature on the boundaries, but I still see the same behaviour.
(The domain is two-dimensional, with the attached boundary conditions.)
Any help would be appreciated. I have attached my code.
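The attached MATLAB code isn't reproduced here, but the kind of study being described can be sketched in a few lines. The sketch below (in Python for illustration; the boundary data T = sin(πx) on the top edge and T = 0 elsewhere is an assumption chosen so an exact heat loss, 2·coth(π), is known) solves the 2-D Laplace equation on successively refined grids and reports the heat loss through the top edge:

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def heat_loss_top(n):
    """Solve Laplace's equation on the unit square with T = sin(pi*x) on the
    top edge and T = 0 on the other edges, using an n-by-n interior grid,
    then return the heat flux integrated over the top edge (k = 1,
    first-order one-sided difference for dT/dy at the boundary)."""
    h = 1.0 / (n + 1)                         # node spacing
    x = np.linspace(h, 1 - h, n)              # interior node x-coordinates
    # 1-D second-difference operator; 5-point 2-D Laplacian via Kronecker sum
    D = sparse.diags([1, -2, 1], [-1, 0, 1], shape=(n, n))
    I = sparse.identity(n)
    A = sparse.kron(I, D) + sparse.kron(D, I)  # nodes ordered row-major in (y, x)
    # Nonzero Dirichlet data on the top edge moves to the right-hand side
    b = np.zeros(n * n)
    g = np.sin(np.pi * x)                      # prescribed top-edge temperature
    b[-n:] -= g                                # equations for the top interior row
    T = spsolve(A.tocsr(), b).reshape(n, n)
    # Integrated normal derivative at y = 1: sum of ((g - T_last)/h) * h
    return np.sum(g - T[-1, :])

exact = 2.0 / np.tanh(np.pi)                   # analytic heat loss, 2*coth(pi)
for n in (20, 40, 80):
    q = heat_loss_top(n)
    print(n, q, abs(q - exact))
```

With a first-order boundary flux like this, the error shrinks only linearly in h, so the heat loss keeps creeping toward the limit at every refinement; that slow drift can easily look like non-convergence on a plot.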

3 Comments

From what I can tell, you want to monitor the value of g as a function of mesh resolution, but it cannot be plotted because I think it is not indexed correctly. Can you show what your [non-]convergence looks like? Since you appear to be using first-order derivatives, it's possible you just aren't reaching high enough resolution to get into the convergence regime.
How are you assessing convergence? Are you trying to fit a power law in mesh size?
Also, your code is inefficient: pay attention to the warnings about preallocation and sparse indexing.
I even tested it with a power law in mesh size, but it's still the same issue. I have attached the graph; I'd appreciate hearing your thoughts.
All my graphs look like this:
That looks pretty convergent to me... the power law needs to be in 1/num_nodes.
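One standard way to act on that advice is Richardson extrapolation: from three solutions on grids refined by a constant factor, estimate the observed order of accuracy and the mesh-converged value, rather than waiting for the raw numbers to stop moving. A minimal sketch (the factor-of-2 refinement and the synthetic first-order data are assumptions for illustration):

```python
import math

def richardson(q_h, q_h2, q_h4):
    """Given a quantity computed on grids of spacing h, h/2 and h/4,
    return the observed order of accuracy p and the extrapolated
    (mesh-converged) value, assuming Q(h) = Q* + C*h^p."""
    p = math.log2(abs((q_h - q_h2) / (q_h2 - q_h4)))   # observed order
    q_star = q_h4 + (q_h4 - q_h2) / (2**p - 1)          # extrapolated value
    return p, q_star

# Synthetic data following Q(h) = 2 - 3*h, i.e. first-order convergence
p, q_star = richardson(2 - 3 * 0.1, 2 - 3 * 0.05, 2 - 3 * 0.025)
print(p, q_star)   # p near 1, q_star near 2
```

If the observed p matches the scheme's formal order, the computed heat loss is in the asymptotic regime and q_star is a reasonable estimate of the grid-independent value.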




Asked: 1 Jan 2021
Commented: 3 Jan 2021
