Project Idea: Enhancing Desktop Search

Dec 5, 2006 at 4:36 AM
Is it possible to make enterprise desktops more manageable by automatically collecting, organizing, and archiving existing files into a central repository (in line with the Enterprise Content Management Policy) and cleaning up the desktops? Users would be given a list of the documents that were picked up from their desktops, hyperlinked to their new locations in the central repository.

There could be several benefits to such a system, including:
1. Making life easier for the network administrator.
2. Preventing legal liabilities.
3. Protecting intellectual property.
4. Keeping enterprise content secure and organized.
5. Increasing visibility through the shared repository.
6. Providing statistics on the rate at which the enterprise creates new content.
7. Preventing duplication of files and content.
8. Helping establish relationships between documents and content parts.
9. Giving users cleaner desktops with improved performance.

How would it happen?
With a defined Content Management Policy in place, most enterprise documents would be created from enterprise templates that already carry the metadata needed to identify their content type.
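For illustration, the content-type list that the server publishes could look something like the Python sketch below. The field names and values, including the "extensions" hint used later for classification, are assumptions made for this write-up rather than part of any defined policy.

# Hypothetical shape of the content-type list published by the server.
# All field names and values are placeholders, not a proposed standard.
CONTENT_TYPES = [
    {
        "content_type": "Invoice",
        "template": "templates/invoice.dotx",
        "extensions": [".docx", ".xlsx"],
        "required_metadata": ["customer_id", "invoice_number", "date"],
    },
    {
        "content_type": "ProjectReport",
        "template": "templates/project_report.dotx",
        "extensions": [".docx", ".pptx"],
        "required_metadata": ["project_code", "author", "department"],
    },
]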

The desktop search application would play an important role here. It would need to evolve and take more control over the user's desktop. These applications can already crawl the hard disk and index its content; they could be enhanced to do the following (a rough sketch follows the list):
1. Download the content-type list from the server.
2. Ensure that every new document created by the user is classified as one of the content types.
3. Associate necessary metadata with each file based on its association with other documents.
4. Perform file-system operations on the user's disk according to instructions from the server.
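A minimal sketch of such an enhanced desktop agent, in Python, is given below. The server endpoints (/content-types, /instructions), the sidecar metadata files, and the extension-based classification are all simplifying assumptions; a real agent would hook into the desktop search index and the template metadata directly.

import json
import shutil
import urllib.request
from pathlib import Path

SERVER = "http://cms.example.com"      # hypothetical content-management server
DESKTOP = Path.home() / "Desktop"

def download_content_types():
    # Step 1: fetch the content-type list from the server.
    with urllib.request.urlopen(f"{SERVER}/content-types") as resp:
        return json.load(resp)

def classify(path, content_types):
    # Step 2: map a new document to one of the defined content types.
    # Extension matching stands in for real template/metadata inspection.
    for ct in content_types:
        if path.suffix.lower() in ct.get("extensions", []):
            return ct["content_type"]
    return "Unclassified"

def write_metadata(path, content_type):
    # Step 3: associate metadata with the file (stored here in a sidecar file).
    Path(str(path) + ".meta.json").write_text(
        json.dumps({"content_type": content_type}))

def apply_instruction(instruction):
    # Step 4: perform file-system operations requested by the server.
    if instruction["action"] == "archive":
        shutil.move(instruction["path"], instruction["target"])
    elif instruction["action"] == "delete_duplicate":
        Path(instruction["path"]).unlink()

if __name__ == "__main__":
    types = download_content_types()
    for doc in DESKTOP.glob("*.*"):
        if not doc.name.endswith(".meta.json"):
            write_metadata(doc, classify(doc, types))
    with urllib.request.urlopen(f"{SERVER}/instructions") as resp:
        for inst in json.load(resp):
            apply_instruction(inst)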

A central content-management server would initiate the crawling process at regular intervals (daily or weekly) by contacting the desktop search indexers. It would collect the required index and metadata information from each desktop search application, analyze it, and compare it against the information already held on the server. The server would then send the respective desktop search applications instructions to act on. These actions could be custom-defined and could include the following (a sketch of the server-side comparison follows the list):
1. Collect new content from the desktops.
2. Identify duplicate content and take action on it.
3. Update the server search index.
4. Update audit and content-creation reports.
5. Send alerts and notifications.
6. Automatically set workflows in motion.
7. Send the desktop application a hyperlinked list of the documents that were removed from the desktop.
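As a rough illustration of the server-side comparison, the Python sketch below matches the content hashes reported by a desktop indexer against the central repository to decide which files to collect and which to treat as duplicates, and it builds the hyperlinked list mentioned in point 7. The data shapes are assumptions for this sketch, not a defined protocol.

def plan_actions(desktop_index, repository_index):
    # desktop_index:    {file_path: content_hash} reported by a desktop indexer
    # repository_index: {content_hash: repository_url} for archived content
    # (both shapes are assumptions for this sketch)
    instructions, hyperlinks = [], []
    for path, digest in desktop_index.items():
        if digest in repository_index:
            # Already archived centrally: remove the local copy and give
            # the user a hyperlink to the repository version instead.
            instructions.append({"action": "delete_duplicate", "path": path})
            hyperlinks.append({"path": path, "url": repository_index[digest]})
        else:
            # New content: collect it into the central repository.
            instructions.append({"action": "archive", "path": path,
                                 "target": f"//repository/incoming/{digest}"})
    return instructions, hyperlinks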

Comments are welcome.