Vig, what we do is this -
Mash all of the data into one comma-delimited file, the first record containing the column headings.
This obviously means you are denormalising your data.
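As a rough sketch of the layout (hypothetical column names - one row per workstation/product pair, so the workstation details repeat on every row):

    Workstation,OS,Memory,Software,Version
    WKSTN042,Win98,64,Office 2000,9.0
    WKSTN042,Win98,64,WinZip,7.0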
We use this big CSV file as our master file of inventory, updating it daily and comparing yesterday's file against today's to track line-by-line changes (we just use LC and pipe the output to a daily delta file).
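If you don't have a compare utility to hand, plain FC does much the same job - a sketch with hypothetical file names:

    REM Compare yesterday's master against today's, keep only the differences
    FC inventory_yesterday.csv inventory_today.csv > delta.txt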
The big CSV file can then be processed by command-line stuff - like a FINDSTR to get a list of all software on a given workstation, or all workstations with a given piece of software.
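Something like this (hypothetical workstation and product names):

    REM All lines mentioning a given workstation (/I = ignore case)
    FINDSTR /I "WKSTN042" inventory.csv > wkstn042.csv
    REM All lines mentioning a given product (/C: treats the string literally, spaces included)
    FINDSTR /I /C:"Office 2000" inventory.csv > office.csv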
Or we can just load the file into Excel (calling it .CSV and having commas in it lets Excel pull the file in without any manual parsing dialogs). Then we can use autofilters to do exactly what you are suggesting - each column heading becomes a pull-down list, so you can click on SW and get a list of all software products in the file, select one of them and filter the list by that product, and likewise by workstation name, OS, memory size etc.
You can also write your own custom filters to get, say, memory less than 64 MB.
The master file can become very large though (number of workstations x average number of software products installed), so we often use the command-line utils to filter the list down before we eyeball it. Typically we use a small batch file to search the file for whatever, pipe the output into another extract file, and then open it in Excel, as in the sketch below.
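Something along these lines (file names are made up, and it assumes the heading row starts with "Workstation"):

    @ECHO OFF
    REM extract.bat - usage: extract "search string"
    REM Keep the heading row first so Excel's autofilters still get column names
    FINDSTR /B /C:"Workstation" inventory.csv > extract.csv
    REM Append every line matching the search string (case-insensitive)
    FINDSTR /I /C:%1 inventory.csv >> extract.csv
    REM Open the extract in whatever .CSV is associated with - Excel for us
    START extract.csv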
The day-by-day delta report is simple to produce and very useful - we only keep the latest day's master report, plus a small delta file for each execution of the inventory. At one time we were processing the delta file and emailing some extracted information using blat.
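Blat makes that part a one-liner - a sketch with a made-up address and mail server:

    REM Mail the daily delta to the admin (recipient and server are hypothetical)
    blat delta.txt -to admin@ourcompany.com -subject "Daily inventory delta" -server mailhost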
Anyhow - not so elegant, but it works fine. Hope that helps.
Mark