Managing Total Storage in Schools
12th February 2014 at 03:21 PM
One of many things that created discussion after my last blog post was the total amount of storage we provide to our users in school. Unlike some schools we do not impose quotas on user or shared areas, but we do keep the storage very well maintained via automated and manual processes. Many people seemed surprised that our total data store is only 500GB for a school with over 1,000 users. People mentioned figures of 5TB to 20TB, which seems very large to me. Some other network managers mentioned the cost of transferring all that data to solid-state technology, and they have a point. It was also pointed out that the cost of storage has plummeted in recent times, so why shouldn't we let users save what they want? A good question indeed.
I would argue, though, that network managers need to start looking at reducing total storage to make their systems more efficient AND stable. By this I mean that if you fix the quantity (and quality) of data, it becomes a lot easier to manage, easier to back up and quicker to restore, as well as making it much easier for users to actually find their data. So if you are interested in how you can do this, read on:
First of all, let's talk about the advantages of limiting the total storage size:
- Our entire file server backup, SSD to SSD over a gigabit switch, takes 48 minutes.
- Accessing files is quick, as they are all held on solid-state storage with very low access times. When I say quick, I mean really quick: no bottlenecks, no delays, just instant access to any file on our network.
- The file system is more reliable, again because it is based on solid-state technology.
- The quality of the data is "better". By this I mean that things are easier to find and are more relevant to teaching.
Each summer we archive all files over five years old. We make it very clear that this is an automatic process, BUT give all staff the chance to opt out with a simple reply to the email. The archiving is done with a Robocopy script, as you can see below; Robocopy is available in Windows from a command prompt. Once the files are archived to the new directory we leave them there for six months, then move them onto another server on a non-shared drive. Eventually, after a few years, we will archive these off to external hard disks and store them in a secure area of the school.
Code:
robocopy.exe "C:\Data\TeacherDataNew" "C:\Data\_Archive\TeacherDataNew" *.* /S /COPY:DATSO /MOVE /ZB /NP /XA:RHS /XF *.ini *.ico /MINAGE:1800 /R:2 /W:10
Separating Media Files from general storage
We actively encourage all users to save media files to a special media server. This is in fact just another file server with a large Samsung SSD, but it gives nice segregation and makes finding media files easy for all users. We provide links to the Video and Audio folders on every student, teacher and admin staff desktop. This server is backed up less frequently and is not part of our daily backup routine. The biggest downside to this method is that there is no remote access to these files from home. This has caused small problems in the past, but with the advent of things like SkyDrive/OneDrive it is becoming less of an issue. If a user wants to watch a video, it can easily be found via a simple link on the desktop.
Reducing the size of Photos
We re-sample all photos on the network to 1080p resolution. Most digital photos copied to the network are around 6 megapixels; there is no need to store files that big on shared storage, so we manually re-sample these images once a term to fit the biggest screen resolution we provide. This means they can still be printed in full colour and viewed on screen perfectly. Typically this turns 4MB+ files into around 0.2MB. For the actual resizing we use the Image Resizer PowerToy from the CodePlex site. Just load an Explorer window at the root of your network shares and search in the top right for *.jpg files over 1MB. Once the search has finished, right-click on all the files and resize to 1080p. Make sure you run some tests first so you can check the permissions: we've found a bug in the most recent release that resets the permissions to administrator only; the older versions work fine. I always overwrite the previous file and never up-sample in the options.
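The search step above can also be scripted. As an illustrative sketch (not the tool we actually use), a short Python script using only the standard library can walk a share and report the JPEGs worth resizing; the 1MB threshold mirrors the Explorer search described above, and the function name is mine:

```python
import os

def find_large_jpegs(root, min_bytes=1_000_000):
    """Walk a share and report JPEG files larger than min_bytes."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith((".jpg", ".jpeg")):
                path = os.path.join(dirpath, name)
                size = os.path.getsize(path)
                if size >= min_bytes:
                    hits.append((path, size))
    # Largest first, so the biggest space savings appear at the top
    return sorted(hits, key=lambda item: item[1], reverse=True)
```

Running this once a term gives you a hit list to feed into whatever resizing tool you prefer, without touching any file automatically.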
This is one for the future, but I would like to run an internal YouTube-style server at some point. This would reduce the size of video files stored on the network and let them be published online, or to any computer in the school, at an adaptive bitrate, removing the need to store 1GB+ raw video files on the network.
Restricting file types
We use FSRM (File Server Resource Manager) on Windows Server 2008 R2 to prevent students from saving executable files on the network. Our web server also scans and notifies us when zip files are downloaded. We also set FSRM up to email an admin when students save media files, so we can manually remind them to move the files to the media server. If they don't, we move them for them after a few warnings.
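As a sketch of how a screen like this can be set up from the command line on 2008 R2 (rather than through the FSRM console), filescrn.exe can apply one of the built-in templates to a share; the path below is an example, not our actual layout:

```shell
:: Apply the built-in executable-blocking template to the student share.
:: Template names vary by server language; list them first to check.
filescrn template list
filescrn screen add /path:"D:\Shares\Students" /sourcetemplate:"Block Executable Files"
```

A custom template can also be set to email an administrator on a match, which is how the media-file notifications described above work.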
Restricting folder creation
A very effective way of maintaining the structure of a teacher shared drive is to make the root of the drive read only. This means they can't create new folders in the root and have to actively browse to their department's folders. I'm not going to micro-manage what they put in their own shared spaces, but this method does point them in the right direction.
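A minimal sketch of the read-only root with icacls is below. The share path and group names are placeholders, and you should test this on a copy before touching a live share:

```shell
:: Break inheritance on the root, then grant read/list only at the root level.
icacls "D:\Shares\TeacherShared" /inheritance:r
icacls "D:\Shares\TeacherShared" /grant "Teachers:(RX)" /grant "Administrators:(OI)(CI)F"
:: Department subfolders get modify rights, inherited by their contents,
:: so staff can still work freely once inside their own area.
icacls "D:\Shares\TeacherShared\English" /grant "Teachers:(OI)(CI)M"
```

The key detail is that the root grant has no inheritance flags, so the read-only restriction applies to the root folder only and doesn't cascade into the department folders.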
De-duplication
Our backup solution does de-duplication automatically: with Veeam, a 500GB read can turn into a 380GB write. I thought this was worth mentioning to show how many similar files live on the network.
Searching for duplicate files
There are loads of free utilities to find duplicate files in a folder, and it's well worth running one to identify these files. I wouldn't recommend the auto-delete functions of this software; I just run them to find the files and then look at them manually. There are some good options in the link below, although I can't recommend a specific one as some come with optional spyware installs.
How do I identify and delete duplicate files in Windows 7 - Microsoft Community
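If you would rather avoid third-party utilities altogether, the same report-only approach can be sketched in a few lines of standard-library Python: hash every file's contents and group the matches, without deleting anything automatically. The function name is mine:

```python
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    """Group files under root by content hash; return groups of 2+ paths.

    Report-only by design: nothing is deleted, so each group can be
    inspected manually before deciding what to do with it.
    """
    by_hash = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            digest = hashlib.sha256()
            with open(path, "rb") as handle:
                # Hash in chunks so large media files don't exhaust memory
                for chunk in iter(lambda: handle.read(65536), b""):
                    digest.update(chunk)
            by_hash[digest.hexdigest()].append(path)
    return [paths for paths in by_hash.values() if len(paths) > 1]
```

Comparing by content hash rather than by filename catches the common case of the same worksheet saved under different names in several departments' folders.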
Removing large files
We regularly run a program called WinDirStat against the main file server to spot large files. These are often zips or install packages that a teacher has downloaded. We delete all install packages (they are only used once and shouldn't really be there anyway) and advise teachers if there is a large file in their areas that could possibly be moved. Most people don't even know why the large file is in their area, and it's safe to archive or remove.
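For a scripted complement to eyeballing a WinDirStat treemap, an illustrative standard-library sketch can list the biggest files under a share so the report can be mailed or logged; the function name and the top-20 default are mine:

```python
import os

def largest_files(root, top_n=20):
    """Return the top_n biggest files under root as (size, path), largest first."""
    sizes = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                sizes.append((os.path.getsize(path), path))
            except OSError:
                continue  # file vanished or is unreadable; skip it
    sizes.sort(reverse=True)
    return sizes[:top_n]
```

Run against the root of a share, this surfaces the stray ISOs, zips and installer packages described above without anyone having to browse the tree by hand.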
Removing specific file types
If you wanted to move (for example) all PSD files over a year old off the main network shares, just run the script below. We do this once a year, at the end of term, on student folders. The process does not delete files; it moves them into another folder (keeping the structure) so they can be taken offline somewhere. If a user ever needs a file back, a simple email is all that is required, followed by a copy and paste from the live archive folder.
As always this won't suit everyone, but it may give you some ideas on how to manage the ever-increasing storage monster! Be careful when you start archiving files: I never delete a single file, I just move it to a location where it can be stored. Personally I have a live "archive" server that I move things to, where they stay for months if not years, so it is a simple case of restoring if needed. After a few years it may be worth archiving these files to an external hard disk.
Code:
robocopy.exe "C:\Data" "C:\Data\_Archive\PSDs" *.psd /S /COPY:DATSO /MOVE /ZB /NP /XA:RHS /XF *.ini *.ico /MINAGE:365 /R:2 /W:10