Date: Tue, 24 Jan 2017 18:20:50 +0000
All,
Are any of you willing to share requirements or useful information on utilizing a crawler in lieu of a traditional inventory to prep for a system implementation? I have heard both pros and cons, but I would like to hear feedback and lessons learned from those who have successfully done this.

All of the systems I have set up in the past required manual inventory gathering, followed by structure, taxonomy, and metadata build-out after the inventory was complete and analyzed and control schedules were created. This is new to me, and any guidance is appreciated. I have used crawlers in eDiscovery, but never for inventory purposes.

The end goal is to have information captured in a way similar to a traditional inventory, not just to know where unstructured data resides. I know most tools cannot attribute metadata tagging during this process, which presents some issues in gathering the kind of data found in a traditional inventory. And lastly, how do you audit the effectiveness, and what is the absolute lowest confidence score you would accept before requiring a manual review?
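(For anyone unfamiliar with what a crawler typically captures versus a manual inventory: a minimal sketch, in Python, of a file-share walk that records the file-system metadata a basic crawler can see. The function name and field choices here are hypothetical illustrations, not any specific product's output; note that business context like record series or retention triggers is exactly what this kind of pass cannot capture.)

```python
import os
import time

def crawl_inventory(root):
    """Walk a directory tree and capture inventory-style metadata for
    each file: path, extension, size, and last-modified date.
    Hypothetical sketch -- commercial crawlers add content indexing,
    owner lookup, and duplicate detection on top of this."""
    records = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            stat = os.stat(full)
            records.append({
                "path": full,
                "extension": os.path.splitext(name)[1].lower(),
                "size_bytes": stat.st_size,
                # Last-modified date only; a crawler cannot recover the
                # record series or retention trigger a human would assign.
                "modified": time.strftime(
                    "%Y-%m-%d", time.localtime(stat.st_mtime)),
            })
    return records
```

The output is a flat list of dictionaries that could feed a taxonomy or control-schedule analysis, which is roughly where the manual classification work would still have to begin.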
Much thanks,
adh
Amy Harrelson
Records Analyst
Austin Energy
721 Barton Springs Rd. Austin, TX 78704
512.322.6283
[log in to unmask]
List archives at http://lists.ufl.edu/archives/recmgmt-l.html
Contact [log in to unmask] for assistance
To unsubscribe from this list, click the below link. If not already present, place UNSUBSCRIBE RECMGMT-L or UNSUB RECMGMT-L in the body of the message.
mailto:[log in to unmask]