
Thread: Read-only file system error


  1. #1

    Default Read-only file system error

    Hi guys, I would appreciate it if you could help me with this one. We have our intranet/website hosted on a remote Linux Ubuntu server (kernel 2.6.17-11). I logged onto the server as root through PuTTY and was busy backing up and compressing a directory with
    tar czvf tarfile.tar.gz dir_to_tar
    when, halfway through the process, I got a "read-only file system" error, and all the files on the system now seem to be read-only.
    I can fix this with
    mount -n -o remount /
    but every time I upload a few files with my FTP client (FileZilla), the same thing happens and I have to log in with PuTTY again and rerun "mount -n -o remount /" to fix it.
    It all started with me backing up that directory; it might have been because someone had a file open on the intranet or something.
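
    A first diagnostic step (before reaching for fsck) would be to look in the kernel log for whatever error triggered the remount. A minimal sketch, assuming dmesg is readable (you are logged in as root anyway):

    ```shell
    # Count kernel-log lines that mention the remount or the underlying I/O error.
    # (2>/dev/null covers systems where reading the kernel buffer is restricted.)
    errors=$(dmesg 2>/dev/null | grep -ciE 'remount|read-only|i/o error' || true)
    echo "matching kernel log lines: $errors"
    # To see the actual messages rather than a count, drop the -c flag.
    ```

    The messages themselves usually name the device and filesystem involved, which tells you what to fsck.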

  2. #2
    Join Date
    Sep 2006
    Posts
    210

    Default

    Time for fsck, methinks.

  3. #3
    Join Date
    Dec 2007
    Posts
    90

    Default

    Yes, that distro defaults to remounting a filesystem read-only when it encounters an error, which is what must be happening here.
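
    For reference, that behaviour is the ext2/ext3 "errors=remount-ro" policy, which Ubuntu sets by default. A sketch of how you could inspect it; /dev/sda1 is an assumed device name, substitute your real root device (the tune2fs line is echo'd as a dry run):

    ```shell
    rootdev=/dev/sda1                               # assumption: substitute your real root device
    echo tune2fs -l "$rootdev"                      # dry run: run it for real and look for "Errors behavior: remount-ro"
    grep 'errors=' /etc/fstab 2>/dev/null || true   # the policy can also be set as a mount option here
    ```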

    A filesystem check will be needed at the very least, but since it's the root filesystem, you may need to boot from a CD to make sure it can be done safely. At minimum / should be mounted read-only (or better still, not mounted at all) while you fsck.
    Probably best to remount read-only ("mount -o remount,ro /") and then fsck, or boot from rescue media. (Booting into single-user or recovery mode isn't enough, AFAIK, because even then the root fs is already mounted read-write.)
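
    The steps above could be sketched as follows. The echo prefixes make this a dry run, since fsck must never touch a read-write-mounted root; drop them only when running from rescue media (/dev/sda1 is an assumed device name, substitute your own):

    ```shell
    rootdev=/dev/sda1                # assumption: substitute your real root device
    echo mount -o remount,ro /       # at minimum, remount root read-only first
    echo fsck -f "$rootdev"          # then force a full check
    # From a live CD, skip the remount entirely and fsck the unmounted device.
    ```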

    A possible explanation for the behaviour is a latent error already present on that filesystem, caused by random user wickedness or maybe faulty hardware. (I had this frequently on my home PC until I found out the BIOS clock setting for the PCI bus was too high, way over 33 MHz.)

    Another possible explanation is that the tar.gz file you're writing to that filesystem is larger than the filesystem or server can support.
    The file transfer and creation triggers an error on the filesystem, and Linux responds as designed by remounting it read-only.
    There have been some notorious FTP and tar size-limit issues like that (the 2 GB and 2 TB limits come to mind) with various Linux distros in the past, though I thought they had been ironed out by now.
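
    Two quick sanity checks on the size theory (the touch line is only a placeholder so the sketch runs standalone; on the server the archive from post #1 already exists):

    ```shell
    tarball=tarfile.tar.gz          # the archive from the original post; adjust the path if needed
    touch "$tarball"                # placeholder for a standalone run; harmless if the file exists
    ls -lh "$tarball"               # anywhere near the 2 GB mark?
    df -h "$(dirname "$tarball")"   # or is the filesystem simply full?
    ```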

    Regardless of the cause, an fsck is very likely necessary. If the problem happens again afterwards, you'll know that a server or filesystem setting (largefiles?) is preventing it from properly receiving and storing that tgz file for one reason or another.
