January 18, 2018
I have a water level feed that is subject to false positives, and I would like to delete several hundred entries from the data stored on ThingSpeak.
I was looking for the easiest method.
I tried downloading the CSV file, deleting the entries, and re-uploading, but the process fails with an error saying the timestamp is already present.
I also tried modifying the contents of the CSV with corrected values (replacing the incorrect ones), and the same error occurred.
Does one have to delete the entire database and re-upload it for the CSV upload to function properly, or is it possible to modify the false positive values in another fashion?
It would be nice if there were a simple spreadsheet-like editor where the values could simply be deleted or replaced.
March 7, 2017
Unfortunately, you will have to delete all the channel data to re-upload points with the same timestamps. I would recommend setting up a MATLAB analysis to filter your data as it comes in and write the filtered data to another channel. You can use the spreadsheet with the cleaned values to start the new channel. You can use TimeControl to call the MATLAB analysis at regular intervals and process a batch of the data each time segment.
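If you prefer to clean the exported CSV offline before starting the new channel, the filtering step might be sketched like this in Python. The jump threshold and the (timestamp, level) layout are assumptions for illustration, not ThingSpeak specifics; tune the threshold to your sensor's real behaviour.

```python
# Hypothetical threshold: water-level jumps larger than this between
# consecutive readings are treated as false positives (an assumption).
MAX_JUMP = 0.5  # metres

def filter_false_positives(rows, max_jump=MAX_JUMP):
    """Drop readings that jump implausibly far from the previous kept value.

    `rows` is a list of (timestamp, level) tuples, already time-ordered.
    """
    kept = []
    last_level = None
    for ts, level in rows:
        if last_level is not None and abs(level - last_level) > max_jump:
            continue  # likely a freeze/thaw spike; skip it
        kept.append((ts, level))
        last_level = level
    return kept

# Example with a spike at 09:15:
readings = [
    ("2018-01-18 09:00", 1.20),
    ("2018-01-18 09:15", 4.80),  # false positive
    ("2018-01-18 09:30", 1.22),
]
print(filter_false_positives(readings))
```

The same logic could equally live in the MATLAB analysis suggested above; the point is simply that a one-pass jump filter is enough to strip isolated spikes.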
I have felt your pain about wanting to delete specific entries in a channel before. I will note your request in our tracking system, thanks for the feedback.
January 18, 2018
Thank you for your informative response and recommendations...
As nothing in life is easy, here are a few more variables I need to deal with:
1) The data feed is in reality being used in real time to support knowledge about water levels for a local community (free of charge).
2) The number of feeds currently active exceeds the ThingSpeak free level, and the account is operating under the grandfather clause.
3) I would like confirmation that deleting the entire database would not end the grandfather clause.
4) Setting up an additional (filtered) feed is not feasible because:
i) the maximum feed count is already exceeded;
ii) the false positives are random, driven by local weather: water infiltrates the pressure transducer during daytime melt conditions and then freezes and expands (triggering readings) under nighttime ground temperatures. (I am working on a solar-based "defrost" heater, but it will not be available this year.)
From the sound of it, a complete channel clear and re-upload will require:
a) a complete download (archive);
b) merging the downloaded data with the historical, pre-cleansed data within a 15-minute window (to avoid losing channel data uploaded in the meantime).
Are there any constraints on a free account uploading approx. 30,000 entries?
Does this sound like a viable, workable plan?
Note: there is no room for failure, as the freshet flood period is "any day now"...
Very nervous... the risk may exceed the potential benefit.
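The merge in step b) above can be sketched as a timestamp-keyed union, with the cleaned historical values winning on collisions. This is a minimal Python sketch; the file names, column layout, and priority rule are assumptions, not anything ThingSpeak mandates.

```python
def merge_feeds(cleaned_rows, archive_rows):
    """Union of two (timestamp, value) feeds; cleaned entries take priority.

    Both inputs are lists of (timestamp, value) pairs, e.g. parsed from
    ThingSpeak's CSV export with the csv module.
    """
    merged = {ts: val for ts, val in archive_rows}
    merged.update(dict(cleaned_rows))  # cleaned values overwrite duplicates
    return sorted(merged.items())      # back into time order

cleaned = [("2018-01-18 09:15", 1.21)]
archive = [("2018-01-18 09:00", 1.20), ("2018-01-18 09:15", 4.80)]
print(merge_feeds(cleaned, archive))
```

Because the merge is keyed on timestamp, running it a second time against a newer download only adds the entries that arrived in the meantime, which helps with the 15-minute window concern.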
March 7, 2017
Sounds like a very interesting project.
The grandfather clause is for the number of channels. Depending on your workflow, you may be able to condense data into more fields, and use fewer channels.
If you clear all the data in a channel, it will not affect the grandfathered status for the number of channels in use. If you delete the channel, it WILL change the status with respect to the new channel limit policy.
With respect to point 4) ii): why are you worried about historical false positives if they are random and are caused by real physical effects? Can you explain again why you need to remove the false positives from the historical record?
To address concern b), I suggest you borrow a channel from a friend and write the data you want to fix to that channel. Then you can make sure the process works, and just add new data to that channel as it comes in. When you are ready to do the full switch, you can be sure it will work.
If you do decide to copy all 30,000 entries, you will be limited to 960 messages per update, and you will need at least 15 seconds between requests. If you estimate 20 seconds per request, you can complete this in about 10 minutes. You cannot use a single MATLAB analysis with a loop, as the compute time is limited. You could manually press the Run button more frequently, or use a TimeControl to run the job every 5 minutes, in which case the process would complete within a few hours. Keep in mind that this will also consume messages, but you have 3M messages for the year. I recommend thingSpeakRead() and thingSpeakWrite() over the bulk-write API, as the syntax is a bit easier.
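The batch arithmetic above can be checked with a quick back-of-envelope sketch. The 960-messages-per-update and spacing figures are taken from the limits quoted above; the 20-second gap is the conservative estimate, not a hard API number.

```python
import math

# Back-of-envelope pacing for the bulk re-upload described above.
ENTRIES = 30_000
PER_REQUEST = 960   # bulk-update message limit quoted above
GAP_SECONDS = 20    # conservative spacing between requests (>= 15 s required)

requests_needed = math.ceil(ENTRIES / PER_REQUEST)
total_minutes = requests_needed * GAP_SECONDS / 60
print(requests_needed, round(total_minutes, 1))  # → 32 10.7
```

So 32 bulk requests at 20-second spacing land just over the 10-minute estimate, and 32 TimeControl runs at 5-minute intervals take roughly two and a half hours.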
Until I understand your workflow better, it seems that creating a derived channel for the new data and processing it is your best option. You may also be able to purchase a license to increase the number of messages per write and the number of channels allowed. I hope the weather cooperates with you!