Narayan is asking about experience people have had with rules extraction using current tools. I would like to share our experience in that regard and open it up to other experiences and insights.
We have used various tools (Seec, Becubic, Relativity, as examples). I have found that there are five very important (and therefore challenging) aspects, as follows:
(1) Archeology: Do the archeology step (registering artifacts into a tool) and generate the functionality or reports that help you decide where to start digging for rules and where not to dig. The most difficult part is estimating the digging effort, even when the tool gives you complexity metrics during archeology, because most current complexity metrics are aimed at maintenance, not at rules extraction. You may want to come up with your own estimation formulas once you peruse the archeology reports and see which factors affect finding the right code and the time it takes to "mine" rules from it.
(2) Programming Languages: Some tools process more languages than others, so language coverage is important in selecting a tool.
(3) Selecting the Right Tool: During archeology, we usually compare the functionality we need for each task in the rules extraction process. Specific reports or specific dynamic capabilities can speed up the process.
(4) Finding the Rules: Isolating target program code means knowing, ahead of time, where rules are buried and then either slicing out little snippets of code containing a few rules (and operating on these individual snippets) or dynamically using the tool to move forward and backward through the code in search of the boundaries of targeted rules.
(5) Translating Code Snippets into Real Rules (you make an excellent point in this regard): Finally, translating those little code snippets (once you have them) into real business-oriented rule statements is today mostly manual, although the OMG standards may result in more intelligent parsing; let's see what Paul Vincent predicts in that regard for the short-term future. Nevertheless, regardless of how functional a specific legacy understanding tool is for rules extraction, it is usually much better than doing it without a tool. (We have often worked with vendors in producing reports specific to rules extraction. Our STEP™ license now includes our BR Mining and contains a list, by task, of the functionality/reports needed from a tool at each step in the rules extraction process, which helps in estimating difficulty and timeframes.) Please share any experience you have had, whether good or bad or just starting!
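To make step (5) concrete, here is a minimal, hypothetical sketch (all names and numbers invented for illustration, not from any real codebase): the kind of small slice a tool might hand you, followed by the manually written business-oriented rule statement it encodes.

```python
# Hypothetical legacy-style slice, as a tool might isolate it
# during rules mining (names and thresholds are invented).
def compute_discount(order_total, customer_years):
    """Legacy discount logic, sliced out as an individual snippet."""
    if customer_years >= 5 and order_total > 1000:
        return order_total * 0.10
    return 0.0

# Manually translated business-oriented rule statement:
#   "A customer of five or more years receives a 10% discount
#    on any order over $1,000."
```

The translation step is the manual part: the code gives you the boundary values and the computation, but someone still has to phrase the intent in business vocabulary.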
The estimations for project planning depend on so many things: (a) quantity of software artifacts, (b) skill of the human rule extractors <g>, (c) knowledge of the underlying software, (d) quality of the software and its documentation, and (e) functionality of the tool set. (We find it best to do a PoC of 4-8 weeks just to get metrics for estimating the full job.)
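One way the PoC metrics can feed an estimate is a simple scaling sketch like the one below. This is only an illustration of the idea, with invented numbers and adjustment factors, not a formula we use:

```python
# Rough sketch of extrapolating full-project effort from a PoC.
# All parameters and factor values here are hypothetical.
def estimate_effort(poc_hours, poc_artifacts, total_artifacts,
                    skill_factor=1.0, doc_quality_factor=1.0):
    """Scale PoC effort to the full artifact inventory, adjusted
    for extractor skill and documentation quality
    (factors > 1.0 mean slower going)."""
    hours_per_artifact = poc_hours / poc_artifacts
    return (total_artifacts * hours_per_artifact
            * skill_factor * doc_quality_factor)

# Example: a 6-week PoC (240 hours) covering 40 artifacts,
# 600 artifacts in total, poorly documented code (factor 1.25):
full_estimate = estimate_effort(240, 40, 600, doc_quality_factor=1.25)
```

In practice the per-artifact rate varies widely by artifact type, which is exactly why the PoC is needed: it replaces guessed rates with measured ones.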