how did my file get corrupted? (+ how often to back up)
Posted: Mon Oct 23, 2006 12:33 am
Script URL: http://www.timeforlight.com/ccount/index.php?
Version of script: 1.1
Hosting company: dreamhost
URL of phpinfo.php:
URL of session_test.php:
What terms did you try when SEARCHING for a solution:
Write your message below:
I've had the Click Counter installed for a relatively short time, about two months. I only had seven links with a total of fewer than 500 clicks, and I just came home after a few days away and found that none of my links are working. I don't understand how the file could have corrupted itself.
Looking at it in my FTP program, "clicks.txt" says it was last edited yesterday (while I was away). The "ids.txt" file has a last-edit date of sometime last week and only contains one character, so I assume it's corrupted too.
I'd like to better understand how this happened so I can prevent it in the future.
PS: How often should I back up the data? Is it only necessary to back up when a new link is added, or should it be done more often?
PPS: Just had a thought: could robots crawling my site have overloaded the script and caused the loss of data?
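Until someone weighs in, here's the stopgap I'm considering: a scheduled copy of the two data files into a dated folder, which could run daily from cron. This is just a sketch; the `data/` and `backups/` paths are my guesses, not anything from the script's docs, and the demo lines only exist so the snippet runs standalone.

```shell
# Hypothetical layout: assume the counter's data files live in ./data
DATA_DIR="data"
BACKUP_DIR="backups/$(date +%Y-%m-%d)"

mkdir -p "$DATA_DIR" "$BACKUP_DIR"

# Demo seed data so the sketch runs standalone; on a live site the
# clicks.txt and ids.txt written by the script would already be there.
[ -f "$DATA_DIR/clicks.txt" ] || echo "1|example-link|0" > "$DATA_DIR/clicks.txt"
[ -f "$DATA_DIR/ids.txt" ]   || echo "1" > "$DATA_DIR/ids.txt"

# Copy both files into today's backup folder.
cp "$DATA_DIR/clicks.txt" "$DATA_DIR/ids.txt" "$BACKUP_DIR/"
echo "Backed up to $BACKUP_DIR"
```

A daily copy like this keeps yesterday's counts even if the live files get truncated again, at the cost of losing at most one day of clicks.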