
Because I'm a Geek...Fluid Dynamics Simulation...Pebble in a Pond


JrClocker (AKA: JrMiyagi) - Joined Sep 25, 2015
I just finished writing a simulation program...a fluid dynamics model based on a differential equation...the lossy wave equation.

And, because I'm a geek, I coded up a DirectX 9 3D graphing engine. I chose DirectX 9 as I like programming in C# (always hated C++; C I like). Unlike C++, C# uses a managed memory model...less crap to worry about. DirectX 9 was the last version of DirectX with a managed memory interface. (There are other "shells" out there that wrap DirectX 10 and 11...learning DirectX 9 was bad enough!)

Time = 0.000 sec...pebble dropped:

Pebble0.jpg


Time = 0.005 sec

Pebble1.jpg


Time = 0.010 sec

Pebble2.jpg


Time = 0.015 sec

Pebble3.jpg


Time = 0.020 sec

Pebble4.jpg
 
Wave striking a concrete slab...

Time = 0.000 sec...wave disturbance

Pebble5.jpg


Time = 0.010 sec...wave about to hit the slab!

Pebble6.jpg


Time = 0.012 sec...bam!

Pebble7.jpg


Time = 0.018 sec...reflections back

Pebble8.jpg


Time = 0.025 sec...just passing through

Pebble9.jpg
 
Very good...now go play with OpenFOAM. CFD is so much fun.

Whoa...never saw that before...looks pretty cool. I'll have to play around with it and see if it will do what I need it to do.

I have an idea for an underwater SONAR project...:D
 
Bah - OpenFOAM is Linux...I have been avoiding Linux as I already have too much to learn on my plate...

And it's C++...I hate C++
 
Oh - I can simulate lots more...what would you like to see?

I'm porting to a 64-bit application right now. It's easy for the main simulation code (just set the compiler switch), but it's cumbersome for the graphics engine.

I like (and prefer) C#...it's a managed memory model.

The last DirectX version Microsoft supported for managed memory was DirectX 9.0 (dorks...)

Managed DirectX 9.0 doesn't support 64-bit. There are some open-source projects which take the unmanaged versions of DirectX 9, 10, and 11, put function wrappers on them, and bring them into managed C# space. Playing with the 64-bit DirectX 9 version now.

I looked into DirectX 11 (why bother with 10 - hehe)...it's quite a change from DirectX 9...will have to work up the desire to learn that and port my graphics engine over.

(Before I went down the path of learning DirectX 9, I wrote my own 3-D engine using non-DirectX .Net 2-D calls. It was fun to write (brought back memories of Matrix Algebra in grad school), but it was slow - as you would expect.) I can share my non-DirectX and DirectX 9 graphing modules if you are interested...you instantiate the class, give it a window handle, pass in the 3D data points, and BOING - it plots it for you with a default camera view. You can adjust the camera, etc. in code...something like the sketch below.
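Just to give a feel for the usage pattern (the class and method names below are made up for illustration...the real module's names are different):

Code:
// Hypothetical usage sketch of the graphing module described above.
// Graph3D, SetData, SetCamera, and Render are illustrative names only.
using System.Windows.Forms;

double[] x = { 0.0, 1.0, 2.0 };
double[] y = { 0.0, 1.0, 2.0 };
double[] z = { 0.0, 0.5, 0.1 };          // 3D sample points from the simulation

var form = new Form();                    // any window will do
var plot = new Graph3D(form.Handle);      // hand the engine a window handle

plot.SetData(x, y, z);                    // pass in the 3D data points
plot.Render();                            // BOING...plots with a default camera view

plot.SetCamera(azimuth: 45.0, elevation: 30.0, distance: 10.0);  // adjust in code
Application.Run(form);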
 
I know what you mean about the learning plate. I do quite a bit of CFD work...I design and build turbo and supercharged intake systems, and on top of that I am learning Arnold/Maya/Nuke. What fun.........
 
I started out my career with Plasma Physics...my first simulation was written back in the early 1990s in Turbo Pascal (does that date me or what...hehe). The first part solved Maxwell's equations, the second part computed ion densities, the third part ran a diffusion model, and the last part ran a fluid dynamics model for the plasma. It kept cycling until the self computed error converged. Took on average about 4 hours to run on a 386SX with a math coprocessor!

I've written simulations for optics, lasers, fiber optics, RADAR, and outdoor acoustics. Now, I just do it for fun...want to play with SONAR!
 
I always wondered, as far as simulations go, how do you verify your simulation is accurate, or the degree to which it is accurate?
 
Now imagine, instead of coding in C# you had done this in CUDA or OpenCL.

Generally, for performance applications you want to go C++ (not strictly talking about CUDA here).

But I get it. It takes twice as long to write the same functionality in C++ as in C#, and it is really easy to get things wrong if you don't know what you are doing, although you will miss out on some neat optimizations.

And C++ looks much more ELITE :p
 
I always wondered, as far as simulations go, how do you verify your simulation is accurate, or the degree to which it is accurate?

That's a good question...I'll try to give you a good answer! :D

With simulations, you start with some equation from engineering or physics. In this case, the 2-D wave equation:

waveeq.jpg

Where:
p = wavefront pressure
t = time
C0 = speed of wave in free space
µ = a function of space, describing the (loss) characteristics of the medium
H(t) = forcing function
DEL (the upside-down triangle) = the spatial derivative operator (in this case X and Y...only 2 dimensions simulated)
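(In other words - my plain reading of the equation in the image above, so the exact form there may differ slightly - a damped 2-D wave equation with a forcing term:)

Code:
\frac{\partial^2 p}{\partial t^2} + \mu(x,y)\,\frac{\partial p}{\partial t}
    = c_0^2 \left( \frac{\partial^2 p}{\partial x^2} + \frac{\partial^2 p}{\partial y^2} \right) + H(t)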

So, I am trying to simulate the wavefront pressure. Once I have this value, I can calculate a bunch of other things (as pressure is the mechanism that does "work").

To create a numerical approximation for this equation, you start with a spatial (and time) grid. For this simulation model, I chose a fixed-spacing grid (to keep it simple)...you can also do a variable-spacing grid...that involves more calculus...

To impose the equation onto the grid, you replace the space and time derivatives with approximations and have the simulation compute the result, step by step. If you did your calculus and algebra correctly, the simulation will converge onto a solution. (In other words, it will not "blow up" but will show a result that is consistent with itself.)
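For the curious, one explicit time step on that fixed grid looks roughly like this (a simplified sketch of the idea, not my actual simulation code...the array and variable names are just for illustration):

Code:
// Simplified sketch: one explicit finite-difference step for the damped 2-D
// wave equation on a fixed-spacing grid. pPrev/pNow/pNext, mu, and forcing
// are illustrative names, not the real program's.
static void Step(double[,] pPrev, double[,] pNow, double[,] pNext,
                 double[,] mu, double c0, double dt, double dx, double forcing)
{
    int nx = pNow.GetLength(0);
    int ny = pNow.GetLength(1);
    double r2 = (c0 * dt / dx) * (c0 * dt / dx);   // stability needs c0*dt/dx <= 1/sqrt(2)

    for (int i = 1; i < nx - 1; i++)
    {
        for (int j = 1; j < ny - 1; j++)
        {
            // central-difference approximation of the spatial second derivatives
            double lap = pNow[i + 1, j] + pNow[i - 1, j]
                       + pNow[i, j + 1] + pNow[i, j - 1]
                       - 4.0 * pNow[i, j];

            // second-order update in time, plus the loss term and the forcing term
            pNext[i, j] = 2.0 * pNow[i, j] - pPrev[i, j]
                        + r2 * lap
                        + dt * dt * forcing
                        - mu[i, j] * dt * (pNow[i, j] - pPrev[i, j]);
        }
    }
    // Edge cells are left untouched here, which behaves like a simple
    // rigid (reflecting) boundary.
}

Then you rotate the buffers (prev takes now, now takes next) and repeat for the next time step.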

Finally, you run a "standard" problem through the simulation. Differential equations have known solutions for simple setups and boundary conditions. You plug these into your simulation and make sure it produces a result that is consistent with the known solution. You could spend a whole PhD thesis deriving a closed-form solution for the non-standard cases...but that's why we do numerical simulations!
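As a rough example of what I mean: for a square box with rigid walls and no loss, the lowest standing-wave mode has a known closed-form solution, so you can run that case through the solver and check the worst-case deviation. RunSimulation() below is just a stand-in for setting up and stepping the grid (not my actual test code):

Code:
// Illustrative check against a known analytic solution.
// RunSimulation() is a hypothetical helper that initializes the grid with the
// cos(pi*x/L)*cos(pi*y/L) mode, sets mu = 0, and steps forward to time tEnd.
using System;

double L = 1.0, c0 = 343.0;
double dx = L / 200.0, dt = 0.4 * dx / c0, tEnd = 0.01;

double[,] numeric = RunSimulation(dx, dt, tEnd);

double maxError = 0.0;
for (int i = 0; i < numeric.GetLength(0); i++)
{
    for (int j = 0; j < numeric.GetLength(1); j++)
    {
        double x = i * dx, y = j * dx;
        // analytic (1,1) standing-wave mode for a rigid square box
        double exact = Math.Cos(Math.PI * x / L) * Math.Cos(Math.PI * y / L)
                     * Math.Cos(Math.Sqrt(2.0) * Math.PI * c0 * tEnd / L);
        maxError = Math.Max(maxError, Math.Abs(numeric[i, j] - exact));
    }
}
Console.WriteLine($"Max deviation from the analytic solution: {maxError:E3}");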

I hope I answered your question! :thup:
 
Now imagine, instead of coding in C# you had done this in CUDA or OpenCL.

Generally, for performance applications you want to go C++ (not strictly talking about CUDA here).

But I get it. It takes twice as long to write the same functionality in C++ as in C#, and it is really easy to get things wrong if you don't know what you are doing, although you will miss out on some neat optimizations.

And C++ looks much more ELITE :p

CUDA or OpenCL...agree with you on that one (on my list of "things" to learn - hehe).

I do disagree that C++ is more efficient than C#. In the Microsoft environment, both languages compile down to the same intermediate code that interfaces with the .Net libraries. The .Net libraries do all of the heavy lifting...so you go on thinking that C++ is faster without providing any proof.

I personally think C# is a cleaner language and easier to write in than C++. Yes, I am very proficient in C++ (I've been programming for over 35 years). Usually, folks have a difficult time learning the .Net framework. Once you have learned that, C#, C++, Visual Basic...shrug...it's just a preference, as the heavy lifting is done by the .Net libraries. You keep coding in C++ and I'll be sipping a glass of wine as I watch you chase down your memory leaks (you don't get those with the managed memory model in C#). :thup:

And, C# looks much more awesome! :D

(Your name wouldn't happen to be Mark...would it?)
 
Well, I am definitely younger than you, and my name isn't Mark <.<

I am guessing you, like me, are not the kind of guy who wants to mess with pointers and stuff like that, but I have seen some scenarios where C++ is marginally faster than C#. A quick search on Google will show you what I mean. There are also some cases where managed is even faster than unmanaged, but then I could argue that with some proper tuning you could eventually reach the same performance in C++, just with 10x the headaches.

I can't say I am an expert, but I know a friend or two working at a software house, and they still use C++ for their graphics engine. They tried to explain the reasons to me, the main one being the lack of support from Microsoft for managed memory languages in DirectX, but also because it gives you much more potential for optimizations.

Microsoft tried to create a game development framework called XNA, and it used C#. It has since disappeared, which says a lot about what the industry's preferences are. (And it was buggy as hell...the event manager sometimes wouldn't unregister events and we had to build our own...)

I totally get what you are saying though, and I think you got the wrong impression.
I also totally loathe C++...it scares me, and I have been fortunate enough not to get deep enough into it in my career to spend countless hours chasing memory leaks, opening 2 freaking files for 1 class, or unraveling the mysteries of why it takes 2 hours to build a simple project because some troll decided to include every damn class in each class.

Or the same troll defining "true" as a random value between 0 and 1 in the preprocessor (although it was fun).

And as a last note, regarding memory leaks: we get them in C# too. Sometimes I think it is better to be forced into fixing something than to develop bad habits.
 
As far as C++ being faster than C#...I have not seen any evidence of that within the Microsoft tools. They both compile to intermediate code and run the same .Net libraries. If I want to do the crazy, supposedly high-speed pointer stuff in C#, I can...just wrap the code in an "unsafe" block...something like the sketch below. But that defeats the reason for using a managed memory model.
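For example, something like this compiles fine in C# once unsafe code is enabled (just a sketch to show the idea):

Code:
// Sketch only: raw pointer access in C#. Requires the "unsafe" keyword and
// building with unsafe code enabled (AllowUnsafeBlocks).
static unsafe double SumBuffer(double[] data)
{
    double sum = 0.0;
    fixed (double* p = data)        // pin the array so the GC can't move it
    {
        for (int i = 0; i < data.Length; i++)
        {
            sum += p[i];            // raw pointer indexing, no bounds checks
        }
    }
    return sum;
}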

In my humble (but accurate) opinion, good coding practice wins out over everything. If you need super-fast speed, use the programming language and the .Net tools to get there. (Look up Parallel.For and Parallel.ForEach if you have never used them...a rough example is below.)
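For instance, spreading the grid update across cores is about this much work (the grid and StepCell() names are placeholders, not my real code):

Code:
using System.Threading.Tasks;

// Sketch: run each interior row of the grid update on the thread pool.
// height, width, pNow, pNext, and StepCell() are placeholder names.
Parallel.For(1, height - 1, i =>
{
    for (int j = 1; j < width - 1; j++)
    {
        // rows are independent for an explicit update, so this is safe to parallelize
        pNext[i, j] = StepCell(pNow, i, j);
    }
});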

XNA sucked...it was a nice try...but it sucked...like trying to force me to use WPF when I don't need WPF...

Your friend is correct in that Microsoft no longer offers support for managed memory languages for DirectX (read the 2nd paragraph of my first post!). Don't know the reasons why...it's Microsoft...they do stuff because they want to. However, you can get 3rd-party wrappers which create C# classes, decorate them, and make the C++ function calls into DirectX 10 and 11. I have used these for DirectX 9 to create a 64-bit application (not supported directly by DirectX 9) and they work great. However, the namespaces change...and I had to do some shoehorning.

DirectX 11 is on my plate of "learn this someday"...once I find a reason to learn it! :thup:

The funniest code comments I have seen...to pulse a watchdog:
- "Kick the dog"
- "Tickle my Elmo"

Here is the dumbest comment I ever saw in code (I fired this guy):
- "I don't know what this code does, but if I remove it or move it things don't work"

By the way...good programming practices, programming standards, variable naming standards, and code comment standards go a LONG way toward making software good. I do not believe in "self-documenting code" (i.e., "I write it with great variable names and people will figure it out"). If I have to figure out what you are doing, you need to add comments or write cleaner code! :thup:

Thanks for feedback...good discussion. :D
 