It is worth remembering that the most common way to lose data is user error. This includes things like accidentally deleting the data or its backups, incorrectly configuring the software that is meant to look after the data (backups), and accidents like dropping your hard drive.
Then there are software and hardware issues, e.g. disk failure or software failures that aren't down to user error.
Finally there is malicious intent. This is probably the least common cause and can be virtually eliminated by the right software and keeping systems up to date.

I'll tell you what I do. I have a lot of data, around 20TB.
I separate this into three categories: still pictures, pictures for time-lapse, and video.
I also separate it into RAW (unprocessed) images and processed images.
The RAW images I archive, and I keep two copies. Some of the disks are quite old now, but I don't try to consolidate them, I just store them. By this stage I know the data on them is good, and I don't want to introduce potential problems (user error) by trying to consolidate them.
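A quick way to gain that confidence, if you like scripting, is to checksum the two copies against each other. This is only a rough Python sketch, and the mount points are made up, but it shows the idea (run it in both directions to catch files missing from either copy):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def compare_archives(copy_a: Path, copy_b: Path) -> list[str]:
    """List files under copy_a that are missing or differ on copy_b."""
    problems = []
    for file_a in copy_a.rglob("*"):
        if not file_a.is_file():
            continue
        rel = file_a.relative_to(copy_a)
        file_b = copy_b / rel
        if not file_b.is_file():
            problems.append(f"missing on second copy: {rel}")
        elif sha256_of(file_a) != sha256_of(file_b):
            problems.append(f"checksum mismatch: {rel}")
    return problems

if __name__ == "__main__":
    # Hypothetical mount points for the two offline archive disks.
    for problem in compare_archives(Path("/mnt/archive_a"), Path("/mnt/archive_b")):
        print(problem)
```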
I keep a historical copy of RAW + processed images on 8TB disks for easy access. I currently have two of these: one for time-lapse and one for stills and video. This is consolidated data and I will delete useless data when I see it, so it is open to user error, but I always have the offline archive (two copies).
I keep my current data, working files that include almost all of my final images and videos but only the latest RAW files, on a RAID disk array. This lets me easily access all my data, and I always have two offline copies of the RAW data.
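The rule I try to follow when moving things onto the archive is add-only: never overwrite, never delete. If you script your copies, a dry-run mode is a cheap guard against exactly the user errors above. A rough sketch (the paths here are placeholders, not my real layout):

```python
import shutil
from pathlib import Path

def add_new_files(source: Path, archive: Path, dry_run: bool = True) -> None:
    """Copy files from source into archive, never overwriting or deleting."""
    for src in source.rglob("*"):
        if not src.is_file():
            continue
        dest = archive / src.relative_to(source)
        if dest.exists():
            continue  # already archived; leave it alone
        if dry_run:
            print(f"would copy {src} -> {dest}")
        else:
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dest)  # copy2 preserves timestamps

if __name__ == "__main__":
    # Hypothetical paths: the latest RAW files and one offline archive disk.
    add_new_files(Path("/data/raw_current"), Path("/mnt/archive_a"), dry_run=True)
```

Run it with dry_run=True first and read the output before doing it for real, once per archive copy.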
This is all possibly overkill for someone with less than 1TB of photos, as a couple of 4TB backup disks would suffice. Maybe it doesn't matter so much to you if you lose data, but at least make that a conscious decision.

Oh yes, I also keep my system up to date and run current antivirus software.

I remember working at a large IT site many years ago, and they lost a large amount of data when someone didn't quite understand how to program SAS. They wanted to do a nightly cleanup by deleting all temporary files that weren't allocated. Unfortunately SAS's logic works somewhat in reverse, and they managed to write a program that deleted all non-temporary files that weren't allocated. Tape files were included. The program took quite a while to run, and the operators worked hard to get it to finish. The next day they discovered what they had done. All of it would have been recoverable if some programmers hadn't assumed that a file written to tape was automatically backed up (since tape was the backup medium in those days). They lost quite a lot of data, and it was user error.

Over the 15 years I was there, we lost data to hardware failure only once, and even then only a few hours' worth. I think the moral of the tale is: if you are actively working with data, you have a chance of destroying it through user error.