-
First, thanks for a great dashboard! I've been using it for a few months now on an Ubuntu setup. I wanted to add our local utility's real-time pricing, as well as API data from Enphase microinverters, to the dashboard. I used telegraf.local to set that up, and it worked fine until I noticed that data was being purged after 3 days. I found that telegraf.conf sets the 'raw' RP (3-day retention) as the default, so I created a second output plugin in telegraf.local with a namepass for my data, sending it to the 'autogen' RP. That didn't work: no data went to autogen, it still went to raw.

I then created a new RP 'local' in influxdb.sql (which I don't like, as it will be overwritten during dashboard upgrades) and modified the output plugin to match, but data still only goes to the 'raw' RP. If I manually trigger Telegraf in the container (e.g. `docker exec -it telegraf telegraf --config /etc/telegraf/telegraf.d/local.conf --once --debug`), it DOES send data to the 'local' RP, so the config files seem OK. I've deleted and reinstalled the dashboard to see if that's the problem, and even reinstalled the OS.

Disclosure: I'm a newbie to Grafana/InfluxDB/Telegraf. I've wanted to learn them and have been using this project to do so. So I'm not sure if I'm doing something wrong, or if the dashboard project is altering something I haven't found yet that erases/ignores the second output plugin. Anyone have any ideas?
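For reference, the second-output setup I'm describing looks roughly like this in telegraf.local (a minimal sketch; the measurement name `pricing` and database `powerwall` are placeholders for my actual inputs):

```toml
# Hypothetical second InfluxDB output in telegraf.local.
# Measurement name "pricing" and database "powerwall" are placeholders;
# adjust to match your own input plugin and the dashboard's database.
[[outputs.influxdb]]
  urls = ["http://influxdb:8086"]
  database = "powerwall"
  retention_policy = "autogen"   # long-retention RP instead of 'raw'
  namepass = ["pricing"]         # only route this measurement here
```

Note that the default output in telegraf.conf has no matching namedrop, so the same metrics may also land in the 'raw' RP.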
-
Update: I set up a manual Telegraf trigger as an overnight cron task (data going to the autogen RP) and noticed I was getting double entries this morning. I commented out the cron task, and the dashboard continued to update?!? I'm not sure of the root cause of all this, but I will continue to monitor.
-
I'm no expert, but maybe this can help you figure it out or point you in the right direction. Did you copy telegraf.local 1:1? If so, that would be why you are getting duplicates when manually running it. You could try creating a telegraf.pricing and only use that for the pricing information, so it's also not capturing all the other data; strip out the parts getting Powerwall and weather data. Also, as I think you already know, the default DB is on a short retention period of 3 days, so you could create an alternative retention policy specific to this data with no expiration: instead of writing to raw.powerwall, you'd write to something like pricing.powerwall and put no retention limit on it.
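A retention policy like the one suggested above can be created from the InfluxDB 1.x CLI (a sketch, assuming the dashboard's database is named `powerwall` and the new RP is called `pricing`):

```sql
-- Hypothetical: create an RP named "pricing" on the "powerwall" database
-- with infinite retention (DURATION INF) and a single replica.
CREATE RETENTION POLICY "pricing" ON "powerwall" DURATION INF REPLICATION 1

-- Data written under it is then queried with the fully qualified form
-- database.retention_policy.measurement (measurement name assumed):
SELECT * FROM "powerwall"."pricing"."pricing" LIMIT 10
```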
-
Thanks for the reply. Yes, I was actually using the exact same telegraf.local, calling it directly alongside the running dashboard via the docker exec command. The funny thing is that before I set that up to run via cron, nothing was going into the autogen retention policy, but after the cron setup the dashboard ALSO started sending to autogen, in addition to the cron task, hence the duplicates. I stopped the cron, and as of now data is still being saved via the running dashboard. So ... it's working, but I don't know why (always scary).

I had originally created an additional retention policy, as you suggest, but it was not working either back then. I don't really like that solution, as without manual intervention it would get stepped on when the PW Dashboard is upgraded. I really like the telegraf.local scheme: it keeps everything together, and it's very easy to pull in other data from the main dashboard to use with the pricing and the Enphase microinverter data. I'm going to set it up again from scratch on a sandbox PC and see if I can get to the root cause ... but probably not until I panic when it stops updating (smile).
You should remove the custom lines you added to `influxdb.sql` and put them in another file, perhaps `local.sql`. As to how upgrade.sh works, it does not revert any previous instructions; it just re-runs `influxdb.sql` in case we add new lines (e.g. a new retention policy) as part of the project. Anything you add locally will not be changed if you use different names for the policies. Your `local.sql` will not be needed unless you reinstall everything from scratch (e.g. erase the InfluxDB data).
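A `local.sql` along those lines might contain just the custom statements (a sketch; the RP name `local` and database `powerwall` are assumptions taken from this thread):

```sql
-- Hypothetical local.sql: custom retention policies kept separate from
-- the project's influxdb.sql so dashboard upgrades don't touch them.
CREATE RETENTION POLICY "local" ON "powerwall" DURATION INF REPLICATION 1
```

It only needs to be applied once, e.g. with something like `docker exec influxdb influx -execute 'CREATE RETENTION POLICY ...'` (container name `influxdb` assumed); the policy persists in the database afterwards.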