Our company was just acquired by one in TX. We are now in the position of having to keep local and remote Linux (CentOS 5.4) file servers synced up. We have an automated build process that spits out a build of about 4GB every night, and sometimes more often. This is now being done on the corporate network. We have local test facilities that use these builds, as well as corporate facilities that need access to our local resources. It's all getting to be pretty messy.

We are trying to keep the file servers synced up using rsync. Unfortunately, it sometimes takes a while for this huge amount of data to be synced, or worse, someone makes a change to the local server's files, which gets overwritten on the next rsync.

I was wondering if there was a way to create a common file system between both sites (here in MN and in TX). GFS sounds like it might work, but I have not found anyone on Google who claims to have done this. I remember there used to be AFS, which worked in a similar fashion.

I guess I'm hoping to set something up where files will exist on both networks. When a file is opened, the network compares the local and remote file systems and the newest version is used. If the remote copy is newer, it is transferred to the local side as it is used, so the local cache is updated and the next access is entirely local.

Am I dreaming? Anyone have any ideas? Thanks.

---
Wayne Johnson,              | There are two kinds of people: Those
3943 Penn Ave. N.           | who say to God, "Thy will be done,"
Minneapolis, MN 55412-1908  | and those to whom God says, "All right,
(612) 522-7003              | then, have it your way." --C.S. Lewis
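
One way the overwrite problem described above is commonly handled is rsync's -u/--update option, which skips any file that is already newer on the receiving side, so local edits on the destination are not clobbered by the nightly push. The sketch below is only an illustration: the /srv/builds paths and the buildsrv.tx.example.com host name are placeholders, not anything from the original setup.

    # Nightly push from the MN build server to the TX mirror.
    #   -a         archive mode: recurse, preserve permissions and times
    #   -u         skip files that are newer on the receiver (keeps local edits)
    #   -z         compress data over the WAN link
    #   --partial  keep partially transferred files so a ~4GB build can resume
    rsync -auz --partial /srv/builds/ user@buildsrv.tx.example.com:/srv/builds/

The trailing slashes matter: syncing /srv/builds/ copies the directory's contents rather than nesting a second builds directory on the remote side. This only papers over the conflict problem, though; it does not give the shared, locally cached file system the post is asking about.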