Test how fast PGEN can consume the table #62
Having discussed this with @coretl: we want to check how this affects the CPU usage of the PandA server during this process, and ensure that if we write points to PGEN at 1 MHz, PCAP can capture them at 1 MHz.
@tomtrafford @coretl Here's the timing data for the PandA receiving instructions. Blue: a simple Put command setting a bit to 1. Red: generating and sending lists of data to PGEN. You can also scrutinise the code here: https://github.com/adedamola-sode/panda_test_scripts_issue62.
Please can you put the timer and the for loop around just the inner function (the …). Also, the PGEN script is doing multiple Puts to PGEN within a gather, which means they will happen in parallel, while we want to measure them in series. Please can you remove the gather and …. Finally, I'm not sure what the graph is showing: why does it drop to almost zero every other point?
I'll make the changes and test further. It's 10 points of data for 10 iterations of the main loop, so for some reason every other function call takes drastically less time. I'm not sure why this behaviour is there; it was weird, so I thought a graph would be a good way to show it.
Excellent, those are the results I wanted to see. With a few optimisations we can probably stream points at 1 MHz to PGEN, which was the target. Have you checked that you can produce those points and see them captured in PCAP?
First step of development for #61.
We would like to understand and test the current functionality of the PGEN table: write a large PGEN table and document metrics for how fast the table is consumed.