Time filter optimisation #1540
ndevilleBE
started this conversation in
General
Replies: 1 comment 2 replies
-
Some information about this is on the Performance tuning page.
Also check when the tables were last analyzed, since that is often the reason PostgreSQL does not use the indexes one expects it to use. You can check the vacuum and analyze state using this query:
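The query itself did not survive here, but a commonly used check (a sketch; possibly not the exact query that was posted) reads the `pg_stat_user_tables` statistics view:

```sql
-- Last (auto)vacuum and (auto)analyze time per table in the current database
SELECT relname,
       last_vacuum, last_autovacuum,
       last_analyze, last_autoanalyze
FROM pg_stat_user_tables
ORDER BY relname;
```

If `last_analyze` and `last_autoanalyze` are NULL or very old for the Observations table, running `ANALYZE` manually is a good first step before looking at the indexes themselves.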
-
Dear,
I'm using the dockerized version 2.0.4 of the SensorThings server with a PostgreSQL 14 database.
I'm trying to optimize the response time of filtering options and I came across some odd behaviour.
It seems that when the 'overlaps' filter is combined with other options (e.g. $orderby), the response slows down, while it should be faster since the amount of data to process is reduced.
For instance:
https://sensors.naturalsciences.be/sta/v1.1/Datastreams(5)/Observations?$orderby=result%20asc
-> response time = 476 ms with "@iot.count": 247207
https://sensors.naturalsciences.be/sta/v1.1/Datastreams(5)/Observations?$filter=overlaps(phenomenonTime,%202020-09-01T00:00:00Z/2021-01-01T00:00:00Z)&$orderby=result%20asc
-> response time = 2585 ms with "@iot.count": 1761
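To illustrate, the second request roughly corresponds to a query like the following, and running EXPLAIN ANALYZE on it should show whether an index or a sequential scan is used (the table and column names are my guess at the FROST-Server schema, so adjust them to the actual layout):

```sql
-- Sketch: inspect the plan PostgreSQL chooses for the combined time filter + orderby
-- (table/column names are assumptions based on a typical FROST-Server schema)
EXPLAIN ANALYZE
SELECT *
FROM "OBSERVATIONS"
WHERE "DATASTREAM_ID" = 5
  AND "PHENOMENON_TIME_START" <= '2021-01-01T00:00:00Z'
  AND "PHENOMENON_TIME_END"   >= '2020-09-01T00:00:00Z'
ORDER BY "RESULT_NUMBER" ASC;
```

A "Seq Scan" node in the plan for the time predicate would explain why the filtered request is slower than the unfiltered one.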
I need to optimize at least two requests:
1.1) filter by result
1.2) filter by bounding box
The indexes I currently have on the Observation table are:
Any suggestions on the possible optimization steps I could implement?
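As a starting point, I was considering composite indexes along these lines (index names and column names are assumptions based on what I believe the FROST-Server schema looks like, not a confirmed recommendation):

```sql
-- Sketch: candidate indexes for the two access patterns above
-- (names/columns are assumptions; verify against the actual schema first)

-- 1.1) filter/order by result within a Datastream
CREATE INDEX IF NOT EXISTS obs_ds_result
    ON "OBSERVATIONS" ("DATASTREAM_ID", "RESULT_NUMBER");

-- time filter within a Datastream
CREATE INDEX IF NOT EXISTS obs_ds_phentime
    ON "OBSERVATIONS" ("DATASTREAM_ID", "PHENOMENON_TIME_START", "PHENOMENON_TIME_END");
```

For 1.2 (the bounding-box filter) a GiST index on the geometry column of the features table would presumably be needed instead, since B-tree indexes do not help with spatial predicates.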
Many thanks for your help