RECMGMT-L Archives

Records Management

RECMGMT-L@LISTSERV.IGGURU.US

Subject:
From: Larry Medina <[log in to unmask]>
Reply-To: Records Management Program <[log in to unmask]>
Date: Thu, 1 Dec 2011 11:07:10 -0500
Content-Type: text/plain
Parts/Attachments: text/plain (60 lines)
Interesting reading all the opinions/comments on this topic. Granted, we have
some insights into the "problem" and many of us have "potential solutions"
to offer, but we aren't intimately familiar with the volume, the corporate
culture that led to this, or how this functional unit within the
organization can best participate in curing the cause.

As with many situations, I'm assuming it's kinda like this:  http://goo.gl/ceiPm

If the unit performing the work to "fix" the problem data is funded out of
overhead (as an administrative expense of the business), then essentially all
of the other units are paying for these corrections to be made. And this may
have two benefits: the labor costs for this unit may be lower than those in
other units, AND there may be a guarantee of consistent application of
effort, resulting in normalized data that is easier to locate after the fact.

IF NOT, then maybe the unit should track the effort/cost associated with
fixing the input from other units and suggest to their management a
'chargeback' to cover the cost of re-doing work that was initially done
incorrectly or improperly by others. Maybe if the management of these other
units starts seeing a monthly line-item cost for what it takes to correct
their errors, they will make some changes internally.

Even if there ISN'T a formal chargeback, gathering data to show where the
effort is being expended and who the gross offenders are would potentially
allow SOMEONE at SOME LEVEL to have a conversation with the other units.
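
If the unit wanted to automate that tally, a few lines of Python against
whatever correction log they keep would do it. A rough sketch, with invented
file and column names and an assumed labor rate:

    # Hypothetical sketch: tally correction effort by originating unit.
    # The log file, its columns, and the labor rate are all made up;
    # substitute whatever the unit actually records.
    import csv
    from collections import defaultdict

    totals = defaultdict(float)
    with open("corrections_log.csv", newline="") as f:
        for row in csv.DictReader(f):
            totals[row["unit"]] += float(row["minutes_spent"])

    HOURLY_RATE = 45.00  # assumed loaded labor rate
    for unit, minutes in sorted(totals.items(), key=lambda kv: -kv[1]):
        hours = minutes / 60
        print(f"{unit}: {hours:.1f} hrs = ${hours * HOURLY_RATE:.2f}")

Sorting the totals puts the gross offenders right at the top of the monthly
report.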

As for the issue of verifying/validating each and every entry, that does
seem excessive. Naturally, it depends on the value of the resulting entries
to the organization, but someone has to perform a cost/benefit analysis to
determine whether the benefit of item-by-item review to ensure 100% accuracy
outweighs the cost.

Typically, in a QC effort, a threshold is set for review (initially 100%),
then it tapers off as confidence in the quality of the data being entered
grows: 90%, 80%, etc., until errors start rising again... then the threshold
either goes back up for ALL work, or only for work performed by the
individuals the errors can be tied back to.
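
That taper logic is simple enough to write down. A hypothetical sketch in
Python, with the tolerance and step size picked purely for illustration:

    # Tapering QC review rate: start at 100%, step down while the observed
    # error rate stays under a tolerance, snap back to 100% when it rises.
    def next_review_rate(current_rate, error_rate, tolerance=0.02, step=0.10):
        if error_rate > tolerance:
            return 1.0  # errors rising: go back to reviewing ALL work
        return max(current_rate - step, 0.10)  # taper: 100%, 90%, 80%, ...

    rate = 1.0
    for batch_error_rate in [0.01, 0.01, 0.005, 0.04, 0.01]:
        rate = next_review_rate(rate, batch_error_rate)
        print(f"errors {batch_error_rate:.1%} -> review {rate:.0%} of next batch")

The same function could just as easily be applied per individual rather than
per batch, which matches the second option above.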

A common method of doing this is to "dump the data" into a spreadsheet and
review columns for inconsistencies (data out of range, alpha characters
appearing where data should be numeric, extra spaces, etc.). This type of
data normalization review is a standard practice for QC of data entry.
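
If the dump is exported to CSV, those same column checks can be scripted
instead of eyeballed. A quick sketch along those lines (the file name, column
name, and valid range are hypothetical):

    # Scan one nominally numeric column for the inconsistencies named
    # above: extra spaces, non-numeric values, and out-of-range values.
    import csv

    with open("data_dump.csv", newline="") as f:
        for line_no, row in enumerate(csv.DictReader(f), start=2):
            qty = row["quantity"]
            if qty != qty.strip():
                print(f"line {line_no}: extra spaces in quantity: {qty!r}")
            elif not qty.lstrip("-").isdigit():
                print(f"line {line_no}: non-numeric quantity: {qty!r}")
            elif not (0 <= int(qty) <= 10000):
                print(f"line {line_no}: quantity out of range: {qty}")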

As for the issue of errors in the initial data, one way to attempt to
improve upon that is to change the data entry for fields with "fixed" values
to pull-down menus that users have to select an item from to populate the
fields. And by working with specific units, those pull-down menus can be
trimmed to include ONLY those items that pertain to what each unit does,
thereby making errors less common and speeding up the entry of data as well.
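
Conceptually, the per-unit menus are just subsets of one master controlled
vocabulary. A tiny sketch of the idea, with invented unit and value names:

    # Per-unit pull-down menus drawn from one master controlled vocabulary.
    MASTER_VALUES = ["Contract", "Invoice", "Drawing", "Memo", "Permit"]
    UNIT_VALUES = {
        "Engineering": ["Drawing", "Permit", "Memo"],
        "Accounting":  ["Contract", "Invoice", "Memo"],
    }

    def menu_for(unit):
        # units without a tailored menu fall back to the full list
        return UNIT_VALUES.get(unit, MASTER_VALUES)

    print(menu_for("Engineering"))  # ['Drawing', 'Permit', 'Memo']

Keeping one master list means the smaller menus can't drift into values the
system doesn't recognize.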

Just another set of comments/opinions.

Larry
[log in to unmask]

List archives at http://lists.ufl.edu/archives/recmgmt-l.html
Contact [log in to unmask] for assistance
To unsubscribe from this list, click the link below. If not already present, place UNSUBSCRIBE RECMGMT-L or UNSUB RECMGMT-L in the body of the message.
mailto:[log in to unmask]
