RECMGMT-L Archives

Records Management

RECMGMT-L@LISTSERV.IGGURU.US

Subject:
From:
John Lovejoy <[log in to unmask]>
Reply To:
Records Management Program <[log in to unmask]>
Date:
Mon, 1 Feb 2010 11:09:24 +1100
Content-Type:
text/plain
Parts/Attachments:
text/plain (64 lines)
UNOFFICIAL
Jim

The National Archives of Australia has an operational Digital Archive.
As far as I am aware, there is no "documented standard best practice" on
how often files should be checked.

Currently, we check our files for integrity constantly. We use a tool
called "Rolling Checksum Checker" which starts at one end of the
filesystem, reads each file in turn, calculates its checksum and
compares that with the checksum generated when the file was ingested.
When all the files are done, we start again.

If a problem is found, the administrator receives an email.

Because there is no other load on the file server, checking continually
does not affect performance.
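For anyone curious what that sort of rolling check looks like in practice, here is a minimal sketch in Python. It is not the NAA tool itself, just an illustration of the idea: a manifest maps each file path to the checksum recorded at ingest, and the checker re-reads each file and reports any mismatches. The function names and the manifest structure are my own assumptions.

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Compute the SHA-256 of a file, reading in chunks so large
    files don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def check_files(manifest):
    """Compare each file's current checksum with its ingest-time value.

    `manifest` maps file path -> checksum recorded at ingest.
    Returns the list of paths whose checksums no longer match,
    i.e. the files that would trigger an alert to the administrator.
    """
    problems = []
    for path, expected in manifest.items():
        if sha256_of(path) != expected:
            problems.append(path)
    return problems
```

In a real deployment this loop would run continuously, restarting from the beginning of the filesystem once every file has been checked, and the `problems` list would feed an email notification rather than a return value.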

Rolling Checksum Checker is part of our "Digital Preservation Recorder"
- more information is available at http://dpr.sourceforge.net/. The
open-source code for the checksum checker is available from that
website.

John Lovejoy
[log in to unmask]
Disclaimer: I work for them (in the digital archive sphere) but I do not
always speak for them

-----Original Message-----
From: Records Management Program [mailto:[log in to unmask]] On
Behalf Of Jim Mullen
Sent: Monday, February 01, 2010 3:15 AM
To: [log in to unmask]
Subject: [RM] Long term e-records review

Good Morning All,

Question for the masses along with my 2nd cup o'joe.

In a long-term electronic records storage environment where retentions
can be 50 years or more for large portions of the data and where said
records have been moved from the "production environment" to an
"archive" server, is there a documented standard, best practice, or
recommendation from a recognized authority on how often those records
should be reviewed for integrity and any other management purposes?

Our "production" repository for said data is approaching 20
terabytes, and as I'm writing the instructions on how to use the chosen
tool to archive to a near-line repository, the above question bubbled to
the surface.

Thanks in advance.

Jim Mullen
Wichita, KS


UNOFFICIAL

List archives at http://lists.ufl.edu/archives/recmgmt-l.html
Contact [log in to unmask] for assistance
To unsubscribe from this list, click the below link. If not already present, place UNSUBSCRIBE RECMGMT-L or UNSUB RECMGMT-L in the body of the message.
mailto:[log in to unmask]
