I work for an architecture firm in NYC where we generate lots of large files: files from various 3D, CAD, and imaging applications. In this environment it is hard to manage the storage. Why? Well, here is why.
When a project starts, a folder is created on the file server. This folder has a name or number associated with it to identify the project. For the life of the project, everything related to it is stored here (except emails). These projects last years. I really mean years. Well, if you think about it, how long does it take to build, let's say, a dam or a skyscraper? That is how long these files have to be accessible on the network. If the project is active, it has to be on the production server. If it's on hold or wrapped up, it goes to the archive server. These project folders get to be well over 100GB, and that is only one of many projects.
Why can't I just delete old files?
That's not my job, to be honest. Sounds like a don't-care attitude, right? Wrong! If I went to the lengths of deleting every old file, even though they are on tape backup, I would be restoring files EVERY DAY, all day long. It should not fall on me to be in charge of what gets deleted and what doesn't; that should be the job of the team in charge of the projects. I only provide the means for them to store their files and work without problems. Nice cop-out, right? ;)
So I and my team of admins have been faced with this problem of the servers filling up year after year. The way we used to deal with it was to throw disk at the server. At one point we had about 5 file servers of various sizes filling up. One year I thought I was in the clear. We had a 400GB volume on our main file server and it filled up. We purchased an 800GB drive cage for an HP server, so I think to myself, and tell my boss, that we are in the clear for the next 2 years. Well, 6 months go by and the 800GB is down to 100GB free. We move inactive projects off to free up space, and this goes on for a few more months. We then double that capacity. At the time, 800GB was a hell of a lot of space for a company our size. When we filled it up, I was as shocked as anyone else would be. So now we had 1.6TB. Again, we filled that up in 18 months. So in 2 years we ran through over 2TB (if you include the projects we pulled off to make space). We decided to get an EMC CX300 with 2TB for production and 4TB for archive. This has been working out OK since we got it, but we are still running out of space.
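In hindsight, the lesson was to plan around growth rate, not raw capacity. Here is a quick back-of-the-envelope sketch in Python, using rough numbers pulled from the story above (not our exact figures), of the projection I should have been doing before telling my boss "2 years":

```python
# Back-of-the-envelope capacity projection. The numbers are
# illustrative: the 800GB volume went down to 100GB free in
# about 6 months, which is roughly (800 - 100) / 6 ~= 115GB/month.

def months_until_full(free_gb, growth_gb_per_month):
    """Months remaining before a volume fills at the current burn rate."""
    if growth_gb_per_month <= 0:
        return float("inf")
    return free_gb / growth_gb_per_month

# A fresh 800GB volume at that burn rate:
print(f"{months_until_full(800, 115):.1f} months")  # ~7 months, not 2 years
```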
We are currently looking for a hierarchical storage management (HSM) solution that integrates well with our environment. No, it's not as easy as just going out and getting IBM Tivoli, CommVault, or Veritas Enterprise Vault <--- (we have this for email, and what a pain in the ass it is to set up). We have to make sure that whatever pointer file is left behind can be read by our CAD software. The problem is that our CAD software uses what is called an Xref. An Xref is a set of files associated with the main file you are working on. If I open file building1234.dwg, it can call dozens of other files, and they will all open because of the Xref. This is the root of the storage problem and the reason a solution isn't easy to find. Say we use an HSM solution to move files older than 30 days to an archive spot, leaving a pointer in place, and one of the moved files is part of an Xref. If my CAD software cannot read that pointer file, we can potentially corrupt the main file and delay a project. I have been explaining this situation to these vendors, giving the same example and asking them to find out whether these HSM products will work with CAD software. Of course they don't test anything; they just say "yeah, it should work" so I'll buy it. Well, if the software cost $30 I would buy it and try it, but these products cost thousands of dollars. We all know this software works on Word files and all the common stuff that you find in financial, law, and medical firms, but architecture firms are always left out. It's like no one knows about us. Maybe we should stop designing buildings.
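To make the Xref problem concrete for anyone evaluating these products, here is a minimal sketch in Python of why an age-only HSM policy breaks. The file names and the dependency map are made up for illustration (reading real Xref tables would need a DWG-aware tool), but the logic is the point: a drawing touched yesterday can pull in files untouched for months, so the whole dependency set has to stay online together.

```python
AGE_LIMIT_DAYS = 30  # the naive HSM policy: stub files untouched for 30 days

# Hypothetical Xref dependency map. In reality this would come from a
# tool that can read the Xref tables inside the DWG files themselves.
XREFS = {
    "building1234.dwg": ["site-plan.dwg", "structural-grid.dwg"],
    "site-plan.dwg": ["survey-2003.dwg"],
    "structural-grid.dwg": [],
    "survey-2003.dwg": [],
    "old-proposal.dwg": [],  # genuinely dead: nothing live references it
}

# Days since last modification (made-up numbers).
AGE_DAYS = {
    "building1234.dwg": 1,      # actively worked on
    "site-plan.dwg": 200,       # stale on its own...
    "structural-grid.dwg": 90,
    "survey-2003.dwg": 400,     # ...but still reachable from the live parent
    "old-proposal.dwg": 300,
}

def live_set(active_parents, xrefs):
    """Everything reachable through Xrefs from an active drawing is live
    and must stay online, no matter how old the individual file is."""
    live, stack = set(), list(active_parents)
    while stack:
        f = stack.pop()
        if f in live:
            continue
        live.add(f)
        stack.extend(xrefs.get(f, []))
    return live

active = [f for f, age in AGE_DAYS.items() if age <= AGE_LIMIT_DAYS]
live = live_set(active, XREFS)

# The naive policy stubs every stale file; a safe policy exempts the
# live set. The difference is exactly the files whose stubs could
# corrupt building1234.dwg when the CAD software tries to resolve them.
naive_stubs = {f for f, age in AGE_DAYS.items() if age > AGE_LIMIT_DAYS}
safe_stubs = naive_stubs - live
print("Would wrongly stub:", sorted(naive_stubs - safe_stubs))
# -> ['site-plan.dwg', 'structural-grid.dwg', 'survey-2003.dwg']
```

This is exactly the question I keep asking vendors: does your product either read its own pointer files transparently enough that the CAD software never notices, or does it understand the dependency set so it never stubs a live Xref in the first place?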