Continuing emulation conversation.. #32
-
The more I work with your crate the more I wonder why it isn't more popular.
-
Virtual time is when the time provided to Stakker is disconnected from actual clock time (real time). Stakker just trusts whatever time you give it, and if everything uses `cx.now()` as its source of the current time, the whole simulation stays consistent.

This is good for running long tests, meaning that the test will only consume however much CPU time it needs, rather than doing all the sleeps. (At work we have tests that in real time would take hours, but running in virtual time take very much less. The code under test can't tell the difference, because time is fully virtualised, i.e. the code thinks that hours have passed, but perhaps it was only 2 minutes on the realtime clock.)

For your purposes, I was suggesting running in virtual time, but limiting it to no faster than real time. So if the host CPU is fast enough, it will run the simulation in real time. But if you get scheduled out for a moment, or the CPU is not fast enough, it will run slower than real time, but completely correctly (i.e. time acts consistently in the simulated environment). I hope you can see that by running virtual time, we're simulating a perfect environment where there are no unexpected skips in time.

If you use the main loop I suggested, then that provides the illusion of perfect time-keeping for the simulator, no matter what happens in the Windows environment (scheduling out, temporary high loads, etc). Then to make it go no faster than real time, for the purpose of human interaction, the main loop simply sleeps whenever virtual time gets ahead of the real-time clock.
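A minimal sketch of that kind of loop, assuming Stakker's `Stakker::run` and `next_wait_max` main-loop calls (the 100ms cap and the sleep-based throttle are illustrative choices, and idle-queue handling is omitted):

```rust
use stakker::*;
use std::time::{Duration, Instant};

fn main() {
    // Virtual and real clocks start together and diverge from here on.
    let start = Instant::now();
    let mut virt_now = start;
    let mut stakker = Stakker::new(virt_now);
    let s = &mut stakker;

    // ... create actors here ...

    while s.not_shutdown() {
        // Run everything due at the current virtual time.
        s.run(virt_now, false);

        // Jump virtual time straight to the next timer expiry
        // (capped so the loop stays responsive).
        virt_now += s.next_wait_max(virt_now, Duration::from_millis(100), false);

        // Throttle to no faster than real time: if the virtual clock
        // has got ahead of the wall clock, sleep off the difference.
        if let Some(ahead) = (virt_now - start).checked_sub(start.elapsed()) {
            std::thread::sleep(ahead);
        }
        // If virtual time is behind real time, the simulation runs
        // slower than real time but remains internally consistent.
    }
}
```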
-
Yes, that's right.
-
Continuing the conversation from here:
#31
Sorry, I didn't realize there was a discussions section until you noted it.
I have a lofty dream of converting my 8088 emulator into a general "chip emulator" crate for tons of different types of chips. Since this would be used as a library, I didn't want to force the user to use any particular framework.
I spent a bit of time looking for alternative ways to declare the chips more generically: generators, reference counting, etc.
But after nonstop fighting with the borrow checker and wading through annoying boilerplate, I keep coming back to Stakker. It might end up being required in order to use the chip crate.
Going over your comments:
I am using Windows, and `sleep` sucks for anything less than a millisecond. I haven't found `yield` to do much better. I don't fully understand how virtual time works, though. Would it be able to schedule nanosecond delays?

You can split up the impl across multiple files, however I don't like this because it becomes confusing which file holds which methods. That is why I like this more declarative approach, where you have to specify the file every time you want to use a method.
I started out running all the chips in multiple threads with message passing and mutex locks. The biggest problem is sending out a request for input and then waiting for a response. I timed it: this took a good 500 nanoseconds, regardless of the library used. That is too long. When "booting up", the 8088 BIOS runs a test to make sure the timer chip is running at the correct speed relative to the CPU. Regardless of my attempts, the test kept failing. After I switched to Stakker, the timing issues all magically went away.
Each 8088 tick needs a 210-nanosecond delay, which it successfully hits with Stakker. I have not tried overclocking this to find the maximum performance.
I am a little confused here. Let's say I have a function:
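(A minimal sketch of the kind of self-rescheduling method in question; the `Chip` actor type and `duration` field are illustrative assumptions, not the original code:)

```rust
use stakker::*;
use std::time::Duration;

struct Chip {
    duration: Duration, // e.g. ~210ns per 8088 tick
}

impl Chip {
    fn init(cx: CX![]) -> Option<Self> {
        call!([cx], run()); // kick off the tick loop
        Some(Self { duration: Duration::from_nanos(210) })
    }

    fn run(&mut self, cx: CX![]) {
        // ... emulate one tick ...

        // Schedule the next `run` for `duration` after the time this
        // one was due, i.e. `cx.now() + duration`.
        at!(cx.now() + self.duration, [cx], run());
    }
}
```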
I was under the impression that `cx.now()` would be the time `run` started. Is that not the case?

The reason I wrote `cx.now() + duration` is that the next `run` must run after the `duration`, independent of how much time actually progressed.

If the code is taking too long to run, such that it overshoots the `duration`, then that means I have gone beyond the maximum possible performance. In this case the emulation is running too slowly, and I have hit the limit of what is possible to emulate.

Does this make sense?
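For what it's worth, here is one way that "running too slowly" condition could be made measurable, assuming the virtual-time main loop sketched earlier (this helper and its names are hypothetical):

```rust
use std::time::{Duration, Instant};

/// How far the simulation has fallen behind the wall clock, if at all.
/// `start` is when the run began; `virt_now` is the current virtual time.
fn behind_by(start: Instant, virt_now: Instant) -> Option<Duration> {
    let real_elapsed = start.elapsed();
    let virt_elapsed = virt_now - start;
    // Some(lag) means the host can't keep up with the emulated clock,
    // i.e. the limit of what is possible to emulate has been reached.
    real_elapsed
        .checked_sub(virt_elapsed)
        .filter(|lag| !lag.is_zero())
}
```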