Forum Replies Created
I think you have covered pretty much everything needed to address the problem, and even gone beyond that, to detect whether a device is behaving suspiciously.
Also, @Jeff's suggestion on checking the upload interval is important in limiting invalid data.
Unless anyone has anything else to add, I will propose this as the solution and standardise it in a specs document written for the next firmware release.
Indeed, patching would automate the production of unique firmware binaries without recompiling the source code every time.
But there is also the approach presented here: https://www.uradmonitor.com/topic/automated-device-ids/
The EEPROM can also be written without changing the firmware, using a separate script after the main firmware is flashed.
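As an illustration, the per-unit patching could be a short script run after the build. Everything here is a sketch: the `URAD0000` placeholder, the function name and the image layout are my assumptions, not the actual uRADMonitor firmware format.

```python
# Hypothetical sketch: stamp a unique device ID into a compiled firmware
# image without recompiling. Assumes the firmware reserves a known
# placeholder string (here b"URAD0000") that the script overwrites.
PLACEHOLDER = b"URAD0000"

def patch_device_id(image: bytes, device_id: str) -> bytes:
    new_id = device_id.encode("ascii")
    if len(new_id) != len(PLACEHOLDER):
        raise ValueError("device ID must match placeholder length")
    offset = image.find(PLACEHOLDER)
    if offset < 0:
        raise ValueError("placeholder not found in image")
    # Splice the new ID in place of the placeholder, leaving the rest intact.
    return image[:offset] + new_id + image[offset + len(new_id):]

# Example: stamp ID "11000042" into a dummy image
firmware = b"\x0c\x94" + PLACEHOLDER + b"\xff" * 16
patched = patch_device_id(firmware, "11000042")
```

Alternatively, for the EEPROM route, avrdude can write the EEPROM in a separate step after flashing (e.g. `-U eeprom:w:ids.hex:i`), so the flash image itself never changes between units.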
Very good ideas, I find the one with the favourite station particularly useful.
I’d say yes, it seems a useful thing to have, although my experience with Wiki is limited.
highcharts.com looks interesting, I will check it out in more detail.
@vinz, would you post the link to your implementation here?
I love this idea. Clean and efficient!
Yes, this makes sense. Regarding 2), what would be the memory requirements on the Linux box?
The first uRADMonitor devices used the mega168, but the project grew in complexity so the upgrade was natural.
I was wondering if it would be possible to integrate OTA updates using an exotic technique: have the code run in the first 16 KB section of flash (offset 0), download the new firmware into the second section (offset 16 KB), then move the new firmware block to offset 0 once the download is complete. This might require a bootloader; I wonder if it's possible to do it without one?
Does anyone have any experience with this?
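The two-slot idea can be sketched as host-side pseudocode (Python here, just to show the data flow; the sizes match a 32 KB part). One caveat worth noting: on the megaAVR parts the SPM self-programming instruction can, as far as I know, only execute from the boot loader section, which is why step 2 usually needs at least a tiny bootloader.

```python
# Simulation of a two-slot OTA scheme on a 32 KB flash, split into two
# 16 KB sections. This only models the byte movement, not real SPM writes.
FLASH_SIZE = 32 * 1024
SLOT_SIZE = 16 * 1024

flash = bytearray(FLASH_SIZE)

def download_to_upper_slot(flash, new_firmware):
    # Step 1: the running code (lower slot) streams the download
    # into the upper slot, so the active program is never touched.
    assert len(new_firmware) <= SLOT_SIZE
    flash[SLOT_SIZE:SLOT_SIZE + len(new_firmware)] = new_firmware

def commit_update(flash):
    # Step 2: copy the upper slot down to offset 0, then reboot.
    # On a real AVR this copy must run from code that is not itself
    # being overwritten, which is the bootloader's job.
    flash[0:SLOT_SIZE] = flash[SLOT_SIZE:2 * SLOT_SIZE]
```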
With the current hardware, OTA updates are impossible. Those interested in updating will have to do it manually. It is not complicated at all, but it requires physical access to the device's PCB and a USBasp programmer.
A cron script to compute the monthly average, run once per month?
This could probably be taken further: the cron script would compute the average for the last month, then move all of the previous month's rows to an "archive" database.
However, some people would like to "go back in time" and analyse data: say there is a spike that happened two months ago at a given station. One might want to check the data, including "zooming in" to maximum-resolution data, to understand the phenomenon better, then move to other stations nearby to check for the same thing.
I wonder how this scenario could be implemented efficiently.
There is one row per device per minute added to the database, constantly.
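The cron scenario could look something like this, sketched with SQLite. The schema is assumed for illustration: a `readings` table with one row per device per minute, plus `archive` and `monthly_avg` tables that are my invention, not the real uRADMonitor database.

```python
import sqlite3

def archive_last_month(conn, month):
    # month given as "YYYY-MM". Intended to be run from cron once a month.
    cur = conn.cursor()
    # 1) store the per-device monthly average for fast long-range charts
    cur.execute(
        "INSERT INTO monthly_avg (device_id, month, avg_value) "
        "SELECT device_id, ?, AVG(value) FROM readings "
        "WHERE strftime('%Y-%m', ts) = ? GROUP BY device_id",
        (month, month))
    # 2) move the raw rows to the archive db/table, so "zooming in"
    #    on old data stays possible, just via a slower path
    cur.execute(
        "INSERT INTO archive SELECT * FROM readings "
        "WHERE strftime('%Y-%m', ts) = ?", (month,))
    cur.execute(
        "DELETE FROM readings WHERE strftime('%Y-%m', ts) = ?", (month,))
    conn.commit()
```

Keeping the raw rows in the archive (rather than deleting them) is what makes the "go back in time" scenario workable: the live table stays small, while maximum-resolution history remains queryable on demand.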
I think one useful feature would be to automatically show, on app startup, all stations close to the user's position, so the app would be as informative as possible. Also, stats on radiation trends and a few lines on exposure limits and their effects, as published in the current literature.
Email alerts or push notifications should be configured on the website and delivered to the mobile apps for events like: major changes in background radiation levels at some location, a preconfigured threshold being reached, etc.
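As a rough sketch of the alert rules (the absolute threshold and the ×2 "major change" factor are placeholder values, not project settings):

```python
def check_alerts(reading, baseline, threshold, factor=2.0):
    # Hypothetical rules: alert when a preconfigured absolute threshold
    # is reached, or when the reading departs strongly from the station's
    # long-term background level.
    alerts = []
    if reading >= threshold:
        alerts.append("threshold reached")
    if baseline > 0 and reading / baseline >= factor:
        alerts.append("major change vs background")
    return alerts
```

The server would evaluate something like this per station per upload and fan the result out as email or push, so the units themselves need no extra logic.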
The UI should take advantage of modern UI/UX practices.
I think this idea was first formulated by Peter ( https://www.uradmonitor.com/?open=11000026 ). He suggested adding an overlay with the nuclear plants on the uRADMonitor map, using a descriptive icon. To this I would add a switch to toggle the overlay on/off.
The good thing here is that such information is publicly available: http://en.wikipedia.org/wiki/List_of_power_stations_in_Germany (this is an example for Germany).
We only need to build a list with name, country, city and GPS coordinates, and this can be easily integrated.
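A minimal sketch of such a list and the toggle (field names are my assumptions; the coordinates are approximate, hand-copied from public sources):

```python
# Illustrative entries only, not a project data file.
nuclear_plants = [
    {"name": "Grohnde", "country": "Germany", "city": "Emmerthal",
     "lat": 52.035, "lon": 9.413},
    {"name": "Cernavoda", "country": "Romania", "city": "Cernavoda",
     "lat": 44.322, "lon": 28.057},
]

def overlay_markers(plants, overlay_enabled):
    # The map renders these markers only when the overlay switch is on.
    if not overlay_enabled:
        return []
    return [(p["lat"], p["lon"], p["name"]) for p in plants]
```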
ok, consider it done.
There has been some very good progress with an updated firmware which opens port 80 on the unit, so you can connect to it with your browser on the local network. As soon as it is ready and passes all stability tests, it will be released and made available for update. The idea is to break the dependency on the central server: in case something bad happens and the server goes down, you will still be able to access the unit's measurements directly on your local network, using only a web browser.
So if I understand it correctly, this could be packed into the following scenario:
1) By default, program the units with a default MAC, a default ID and uradmonitor.com as the single server.
2) When the unit connects, it would download a freshly allocated MAC/device ID and a list of alternative servers, and store them in its EEPROM.
3) Periodically, the unit would check whether the alternative servers are still available (and delete those that are offline).
Would this work? Does it cover everything, enough to make this decentralisation viable?
I believe the logic would require only minimal code, which can fit in the available flash memory. The only issue is that I find transmitting the same data to multiple servers a bit redundant (not a big issue, though).
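The three steps above, sketched as host-side pseudocode (the default identity, the response format and the function names are all assumptions for illustration; a real unit would do this in C against its EEPROM):

```python
DEFAULT_SERVER = "uradmonitor.com"

class Unit:
    def __init__(self):
        # 1) factory defaults: placeholder identity, one known server
        self.device_id = "00000000"
        self.mac = "de:ad:be:ef:00:00"
        self.servers = [DEFAULT_SERVER]

    def provision(self, fetch_config):
        # 2) on first connect, the central server hands out a unique
        # MAC/device ID plus a list of alternative servers; the unit
        # stores them (in EEPROM on real hardware).
        cfg = fetch_config(DEFAULT_SERVER)
        self.device_id = cfg["device_id"]
        self.mac = cfg["mac"]
        self.servers = cfg["servers"]

    def prune_servers(self, is_online):
        # 3) periodically drop servers that no longer respond
        self.servers = [s for s in self.servers if is_online(s)]
```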