Many countries and industries have legal restrictions on data use, sharing, and privacy. Some also restrict how customers may be segmented for marketing treatment, to prevent preferential treatment or discrimination. US lenders, for example, cannot use race or gender to make decisions on credit applications. That said, predictive analytics can often help with privacy issues by allowing decisions to be made by customer service representatives or websites without exposing the underlying data to the front-end application. A model can be run on a secure server, with just the resulting score or recommendation passed to the potentially less secure front end. Thus a customer service representative need not see your past bankruptcy, even though it has been used to tailor a credit offer to you. Often the outrage around data mining is in fact concern over data privacy, and many kinds of models can be built without knowing which record belongs to which person, as long as patterns can be matched and analyzed. In addition, models can help with compliance by replacing potentially biased human decision-making with documented, mathematically precise decision-making.
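The score-only pattern described above can be sketched in a few lines. This is a hypothetical illustration, not a real system: the names (`score_credit_offer`, `CUSTOMER_DB`) and the toy scoring rule are assumptions invented for this example. The point is the architecture: sensitive fields stay on the secure server, and only the score and resulting offer cross to the front end.

```python
# Sketch of the "score only" pattern: the model and the sensitive
# inputs live on the secure server; the front-end application
# receives only the resulting score and recommendation.
# All names and the scoring rule here are hypothetical.

# Sensitive data held server-side only; never sent to the front end.
CUSTOMER_DB = {
    "cust-001": {"income": 52000, "past_bankruptcy": True},
    "cust-002": {"income": 87000, "past_bankruptcy": False},
}

def score_credit_offer(customer_id: str) -> dict:
    """Run the model server-side; return only score and offer."""
    record = CUSTOMER_DB[customer_id]
    # Toy stand-in for a real predictive model.
    score = min(record["income"] / 100000, 1.0)
    if record["past_bankruptcy"]:
        score *= 0.5
    return {
        "customer_id": customer_id,
        "score": round(score, 2),
        "offer": "premium" if score >= 0.6 else "standard",
    }

# The front end sees only this dict -- no income, no bankruptcy flag.
response = score_credit_offer("cust-001")
print(response)
```

A customer service application calling `score_credit_offer` can act on the offer without ever holding the bankruptcy record, which is exactly the separation the text describes.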
In general, look for an experienced model developer who knows how data and models can and can't be used in your industry and region, and who understands what regulatory questions may be asked.