
Decision Optimization

The gap between a sound decision and the optimal one can be worth millions of dollars.

Decision Optimization, also known as Decision Intelligence, has its roots in military applications dating back to the 1940s and has evolved into a crucial tool for corporate decision-making over the last two decades. This technology replaces subjective human judgment with mathematical solvers, allowing organizations to tackle complex challenges efficiently. By reverse engineering outcomes, Decision Optimization identifies optimal approaches using an analytics engine, offering real-time solutions for intricate business problems. The process involves inputting objectives, constraints, and questions; mathematical models then explore countless possibilities to generate a detailed, tailored optimal action plan.
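As a concrete illustration of those three inputs, the sketch below enumerates a tiny, entirely invented production-planning problem: one objective, two constraints, two decision variables. A real optimization engine searches such spaces far more intelligently than this brute-force loop; the point is only to show what "objectives and constraints" look like in code.

```python
# Minimal illustration of the inputs an optimization engine works with:
# an objective, constraints, and decision variables. The products,
# prices, and capacities below are made up for the example.

def plan_production():
    best_plan, best_profit = None, float("-inf")
    for chairs in range(0, 101):              # decision variable 1
        for tables in range(0, 101):          # decision variable 2
            # Constraints: limited wood (m^2) and labor (hours)
            if 5 * chairs + 20 * tables > 400:    # wood capacity
                continue
            if 10 * chairs + 15 * tables > 450:   # labor capacity
                continue
            profit = 45 * chairs + 80 * tables    # objective to maximize
            if profit > best_profit:
                best_profit, best_plan = profit, (chairs, tables)
    return best_plan, best_profit

plan, profit = plan_production()
print(plan, profit)   # → (24, 14) 2200
```

A commercial solver reaches the same optimum without enumerating the grid, which is what makes problems with millions of variables tractable.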

Benefits

Decision Optimization is versatile across various industries, accommodating finance, energy, healthcare, distribution, and industrial sectors, among others. One application involves optimizing resource planning—covering human, material, and financial aspects. Automating business operations through decision optimization can enhance process efficiency, cut operating costs, maximize operating margins, and elevate service levels. In retail, advanced analytics can forecast store visitation volume, enabling the adjustment of employee schedules and tasks to align with customer visits. Ensuring the presence of the right employee at the right time and place to assist customers can significantly boost the conversion rate.


Robust Optimization

Powerful optimization algorithms deliver robust solutions for a wide range of complex problems across diverse industries.

Performance

High-performance optimizers efficiently handle large-scale optimization problems, delivering fast and reliable results.

Integration

Seamlessly integrate with different programming languages and environments, facilitating incorporation into existing systems and workflows.

Scalability

Scalable for organizations dealing with increasing amounts of data and complex optimization scenarios.


Our Business Partners in Decision Optimization

Efficient linear programming solvers, capable of solving large-scale linear optimization problems.

Mixed-integer linear programming (MILP) solvers, crucial for optimization problems with both continuous and discrete variables.

Quadratic programming solvers, allowing the optimization of quadratic objectives subject to linear constraints.

Advanced heuristics and optimization algorithms that allow users to fine-tune various parameters for better performance on specific problem instances.

Capable of solving nonlinear programming problems, allowing for more flexibility in modeling complex relationships.

Constraint programming solver for solving combinatorial optimization problems, where the relationships between variables are expressed through constraints.

Designed to take advantage of parallel processing and multi-core architectures, enhancing the speed and scalability of optimization solutions.

Integrated with several programming languages, including Java, C++, and Python, allowing for seamless integration into existing software applications.
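The mix of continuous and discrete variables that a MILP solver handles can be made concrete with a toy sketch in plain Python: binary on/off decisions are enumerated, and the remaining continuous quantity is filled greedily (optimal here because the per-unit costs are linear). All names, costs, and capacities are invented; a real MILP solver would use branch-and-bound rather than enumeration.

```python
from itertools import product

# Toy mixed-integer problem: binary "rent this machine?" decisions plus
# a continuous production quantity. All numbers are invented.
FIXED = [100, 150]   # fixed cost of turning each machine on
VAR   = [3, 2]       # cost per unit produced on each machine
CAP   = [40, 60]     # capacity if the machine is on
DEMAND = 50

def solve():
    best = None
    for opened in product([0, 1], repeat=2):        # discrete part
        capacity = sum(c * o for c, o in zip(CAP, opened))
        if capacity < DEMAND:
            continue                                 # infeasible combination
        # Continuous part: fill demand from the cheapest open machine
        # first (optimal for this simple linear cost structure).
        remaining = DEMAND
        cost = sum(f * o for f, o in zip(FIXED, opened))
        for i in sorted(range(2), key=lambda i: VAR[i]):
            if opened[i]:
                q = min(remaining, CAP[i])
                cost += VAR[i] * q
                remaining -= q
        if best is None or cost < best[0]:
            best = (cost, opened)
    return best

print(solve())   # → (250, (0, 1)): open only machine 2
```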


Use Cases by Industry

Dive into the Optimization use cases by Industry to learn about industry-specific solutions. Our Industries Page is a gateway to discovering how we impact businesses across various sectors. Explore the depth of our expertise, tailored services, and success stories that showcase the transformative impact we bring to different industries. Read the following decision optimization success stories to see how our chosen technologies are used to create maximum value.

Prayas (Energy Group)

With help from IBM Business Partner Cresco International, PEG transitioned to a modeling platform based on IBM® ILOG® CPLEX® Optimization Studio software. The organization can now introduce more variables into its simulations and test them more frequently, yielding richer research data to craft policy recommendations for the energy sector.

FleetPride

If a farmer’s tractor breaks down during harvest or a courier’s van has engine issues, they can’t afford to wait long for spare parts to arrive—they’ve got a job to do. Working with Cresco International, FleetPride is transforming its supply chain management with analytics, helping to ensure customers get the parts they need, when they need them.

Leading Bulk Tanker Transportation Company

Bulk tanker logistics is such a complex problem that many trucking companies can’t offer a comprehensive service: they have to specialize in only a small set of products. Optimization software from IBM is helping this leading tanker carrier to buck that trend by finding efficient routes in near-real time—enabling huge cost savings and supporting growth.

What can Cresco International do with Advanced Analytics?

IBM Business Partner Cresco International shares some of its client successes. Hear from Sanjeev Datta, principal and CEO of Cresco International, about how his company helped its clients increase margins and revenue.

Our Solutions

Workforce Scheduling

Our Workforce Scheduling solution takes every variable & constraint into account to create an optimal output for the scheduling coordinator. By looking at constraints such as the number of hours a nurse can work per week, total cost, allocation of shifts, nurse skillsets, and qualifications, you are able to generate a schedule with the click of a button. This solution provides a user-friendly interface in conjunction with real-time adjustments, and more.
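A drastically simplified sketch of the idea follows. The nurses, wage rates, and skill rules are hypothetical, and real deployments use a constraint-programming or MILP solver rather than enumeration; the sketch only shows how constraints (shift caps, skill coverage) and an objective (wage cost) combine into one schedule.

```python
from itertools import product

# Miniature scheduling model: assign one nurse to each of four shifts,
# respecting a per-nurse cap and a skill requirement, minimizing wage
# cost. All names and numbers are invented for illustration.
NURSES = {"Ada": {"rate": 40, "icu": True},
          "Ben": {"rate": 30, "icu": False},
          "Cy":  {"rate": 35, "icu": True}}
SHIFTS = ["Mon-day", "Mon-night", "Tue-day", "Tue-night"]
NEEDS_ICU = {"Mon-night", "Tue-night"}   # night shifts need ICU skills
MAX_SHIFTS = 2                           # cap per nurse

def best_schedule():
    best = None
    for assignment in product(NURSES, repeat=len(SHIFTS)):
        if any(assignment.count(n) > MAX_SHIFTS for n in NURSES):
            continue                      # hours-per-week constraint
        if any(not NURSES[assignment[i]]["icu"]
               for i, s in enumerate(SHIFTS) if s in NEEDS_ICU):
            continue                      # skill/qualification constraint
        cost = sum(NURSES[n]["rate"] for n in assignment)
        if best is None or cost < best[0]:
            best = (cost, assignment)
    return best

print(best_schedule())   # → (130, ('Ben', 'Cy', 'Ben', 'Cy'))
```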

OptimCampaign

With OptimCampaign, companies can maximize short-term and long-term profit, ROI, or any other objective while satisfying critical business constraints such as marketing budget and channel capacities across multiple marketing channels. OptimCampaign leverages Decision Optimization and Machine Learning technologies to provide optimal marketing strategies by solving large-scale marketing problems.

Trade Promotion Optimization

Next to cost of goods sold, trade promotion spending is the largest expense on a manufacturer’s P&L. However, its effectiveness can be very hard to measure. Promotional assessment is most effective with a solution that uses machine learning, modern optimization techniques, and proven advanced analytics strategies. Brücke, our decision support solution, can help you better navigate trade promotion decision making.

Success Stories

Cargo Flash

A BP-led commercial account in India, where we demonstrated IBM CPLEX against Gurobi and other solvers and won the deal. Our consulting team is now training the client's staff and building airline cargo reservation systems covering warehouse management, revenue optimization, reservation and order optimization, and route optimization, to be embedded into an application for their end customers: major airlines around the globe.

Indigo Airlines

One of the largest passenger airlines in Asia, flying between the UK and China and covering the countries in between. Cresco was called in to discuss the CPLEX product and compete with Gurobi. We assisted in the decision making, technical support of the product, and cloud strategy conversations. The client is leveraging IBM CPLEX for revenue optimization for now, and plans to extend it to crew rostering, ground staff scheduling, and five other areas in this field. Cresco is training their staff on data science and optimization.

Maersk

An IBM-led enterprise account. Cresco demoed CPLEX and sold the client the product. The client uses Microsoft heavily, and we are now positioning Watson Studio to handle all their logistics using IBM DSAI. Cresco is building trust with this account and growing the IBM footprint.

Star TV

IBM Led Enterprise account – Cresco sold CPLEX to Asia’s biggest TV network that is utilizing IBM CPLEX for content management and advertising/revenue optimization. We demoed the CPLEX product and the client is self-sufficiently using CPLEX. We will be providing them with CPLEX training and assisting in self-service of the technology.

Jubilant Foods

The owner of Dunkin' Donuts and Domino's Pizza franchises across India, Sri Lanka, Bangladesh, and Nepal was using open solvers to optimize all manpower schedules. They needed a solution that would process faster and more accurately. Cresco ran a POC with client data, building out models in CPLEX to demonstrate these enhanced features. We converted their code to Python and leveraged the IBM CPLEX solver to schedule assignments and plans for all manpower, with the goal of achieving optimal wage costs, reducing solve time by 60% and increasing solving accuracy by over 10%.

Resources

What is Decision Intelligence Technology?

With decision intelligence technology, you input the details about your complex, real-life business problem—and then the technology gets to work, exploring the trillions of ways you can achieve your goals, given your limitations.

Campaign Optimization with OptimCampaign

Efficiently planning marketing campaigns can be tricky. Marketers must answer very tough questions in a short time, such as how best to allocate a marketing budget across multiple campaigns. OptimCampaign finds the best answers to these tough questions.

CleverShift: A Nurse Scheduling Solution

Manual scheduling can be time-consuming and lead to workforce shortages and unfair schedules, greatly impacting a healthcare organization. CleverShift instantly creates an optimal schedule for the scheduling coordinator.

Analytics, Data Science, and Optimization at FleetPride

Watch how FleetPride successfully implemented AI to increase revenue.

Sports Scheduling with Decision Intelligence

Retail Space Allocations

Watch how CRESCO Brücke allows retailers and store planners to quickly allocate store space and optimize with scenario planning: fast, easy, and in real time.

Compare Commercial Optimization Solvers

| Feature | CPLEX | Gurobi | Xpress |
| --- | --- | --- | --- |
| Parent Company | IBM | Gurobi Optimization | FICO |
| License | Commercial | Commercial | Commercial |
| Supported Platforms | Windows, Linux, macOS | Windows, Linux, macOS | Windows, Linux |
| Problem Types Supported | Linear Programming, Integer Programming, Quadratic Programming, Mixed-Integer LP/QP, Nonlinear Programming | Linear Programming, Integer Programming, Quadratic Programming, Mixed-Integer LP/QP, Nonlinear Programming | Linear Programming, Integer Programming, Quadratic Programming, Mixed-Integer LP/QP, Nonlinear Programming |
| Convex Quadratic Constraints | Supported | Supported | Supported |
| Programming Languages Supported | C, C++, Java, .Net, Python | C, C++, Java, .Net, Python, MATLAB, R | Java, .Net, Python, R |
| Solving Algorithms | Primal-Dual, Barrier, Simplex, Branch-and-Bound, Cut Generation | Primal-Dual, Barrier, Simplex, Branch-and-Bound, Cut Generation | Primal-Dual, Barrier, Simplex, Branch-and-Bound, Cut Generation |
| Warm Start Capabilities | Supported | Supported | Supported |
| Heuristics | Supported | Supported | Supported |
| Parallel Processing | Supported | Supported | Supported |
| Large-Scale Optimization | Supported | Supported | Supported |
| Distributed Computing | Supported | Supported | Supported |
| Community Support | Active | Active | Active |
| IDE | CPLEX Optimization Studio | None | Xpress Workbench |
| Modeling Language | OPL | None | Mosel |
| Robustness | Strong | Strong | Strong |
| Academic Licensing | Available | Available | Available |

Processor Value Units (PVUs)

A Processor Value Unit (PVU) is a unit of measure by which the Program can be licensed. The number of PVU entitlements required is based on the processor technology (defined within the PVU Tables below by Processor Vendor, Brand, Type and Model Number) and by the number of processors made available to the Program. IBM defines a processor, for the purpose of PVU-based licensing, to be each processor core on a chip (socket). A dual-core processor chip, for example, has two processor cores. The tables below list existing generally available processor technologies only, as of the published date. PVU requirements for future processor technologies may differ. 

PVU Table per Core (RISC and System z)

| Processor Vendor | Processor Name | Server Model Numbers | Maximum Sockets per Server | Proc. Model Number | PVUs per Core |
| --- | --- | --- | --- | --- | --- |
| IBM | POWER IFL, any POWER system core running Linux | All | All | All | 70 |
| IBM | POWER10 | E1080 | >4 | All | 120 |
| IBM | POWER10 | E1050 | All | All | 100 |
| IBM | POWER10 | S1022, L1022, S1022s, S1014, S1024, L1024 | All | All | 70 |
| IBM | POWER9 | E980 | >4 | All | 120 |
| IBM | POWER9 | E950 | 4 | All | 100 |
| IBM | POWER9 | H922, H924, S914, S922, S924 | 2 | All | 70 |
| IBM | POWER8 | 870, 880 | >4 | All | 120 |
| IBM | POWER8 | E850 | 4 | All | 100 |
| IBM | POWER8 | S812, S814, S822, S824 | 2 | All | 70 |
| IBM | POWER7 (*4) | 770, 780, 795 | >4 | All | 120 |
| IBM | POWER7 | 750, 755, 760, 775, PS704, p460, Power ESE | 4 | All | 100 |
| IBM | POWER7 | PS700-703, 710-740, p260, p270 | 2 | All | 70 |
| IBM | POWER6 | 550, 560, 570, 575, 595 | All | All | 120 |
| IBM | POWER6 | 520, JS12, JS22, JS23, JS43 | All | All | 80 |
| IBM | POWER5, POWER4 | All | All | All | 100 |
| IBM | POWER5 QCM | All | All | All | 50 |
| IBM | z16, z15 Model T01, LinuxOne III LT1, z14 Models M01-M05 and L01-L05, Emperor, Emperor II, z13, zEC12, z196, System z10 (*1,5) | All | All | All | 120 |
| IBM | z16 A02, z15 Model T02, LinuxOne III LT2, z14 Model ZR1/LR1, z13s, Rockhopper, Rockhopper II, zBC12, z114, System z9, z990, S/390 (*1,2,6) | All | All | All | 100 |
| IBM | PowerPC 970 | All | All | All | 50 |
| IBM | PowerXCell, Cell/B.E. 8i (*3) | All | All | All | 30 |
| Any | All others | All | All | All | 100 |
| HP/Intel® | Itanium® | All | All | All | 100 |
| HP/Intel® | PA-RISC | All | All | All | 100 |
| Oracle / Sun / Fujitsu | SPARC64 VI, VII, X, X+, XII | All | All | All | 100 |
| Oracle / Sun / Fujitsu | UltraSPARC IV | All | All | All | 100 |
| Oracle / Sun / Fujitsu | SPARC M5 / M6 | All | All | All | 120 |
| Oracle / Sun / Fujitsu | SPARC M7 | T7-4 | 4 | All | 100 |
| Oracle / Sun / Fujitsu | SPARC T4/T5/M7/S7/M8 | T5-8, M7-8, M7-16, M8-8 | >4 | All | 120 |
| Oracle / Sun / Fujitsu | SPARC T4/T5/M7/S7/M8 | T4-4, T5-4, T7-4, T8-4 | 4 | All | 100 |
| Oracle / Sun / Fujitsu | SPARC T4/T5/M7/S7/M8 | T4-1, T4-1B, T4-2, T5-1B, T5-2, T7-1, T7-2, S7-2, S7-2L, T8-1, T8-2 | 2 | All | 70 |
| Oracle / Sun / Fujitsu | SPARC T3 | All | All | All | 70 |
| Oracle / Sun / Fujitsu | UltraSPARC T2 | All | All | All | 50 |
| Oracle / Sun / Fujitsu | UltraSPARC T1 | All | All | All | 30 |
| Any | All others | All | All | All | 100 |

Notes:

  1. Each Integrated Facility for Linux (IFL) or Central Processor (CP) engine is equivalent to 1 core.
  2. Refers to System z9, eServer zSeries, or System/390 servers.
  3. Entitlements required for Power Processor Element (PPE) cores only.
  4. The PVU requirement for the POWER7/7+ processor technology is dependent on the maximum possible number of sockets on the server. NOTE: POWER7 refers to POWER7/7+.
  5. z196 refers to IBM zEnterprise 196, zEC12 refers to IBM zEnterprise EC12.
  6. z114 refers to IBM zEnterprise 114.

PVU Table per Core (x86)

| Processor Vendor | Processor Name | Proc. Model Number (1) | Maximum Sockets per Server | PVUs per Core |
| --- | --- | --- | --- | --- |
| Intel® | Xeon® (2) | All post-Nehalem (launched 11/2008) Xeon processor models, including Xeon Scalable (Platinum, Gold, Silver, Bronze) | 2 | 70 |
| Intel® | Xeon® (2) | All post-Nehalem (launched 11/2008) Xeon processor models, including Xeon Scalable (Platinum, Gold, Silver, Bronze) | 4 | 100 |
| Intel® | Xeon® (2) | All post-Nehalem (launched 11/2008) Xeon processor models, including Xeon Scalable (Platinum, Gold, Silver, Bronze) | >4 | 120 |
| Intel® | Xeon® | All pre-Nehalem Xeon processor models 3000 to 3399, 5000 to 5499, 7000 to 7499 | All | 50 |
| Intel® | Core (3) | All i3, i5, i7, i9 | All | 70 |
| AMD | Opteron | All | All | 50 |
| AMD | EPYC | All | All | 70 |
| Any | All others | All | All | 100 |

Notes:

  1. IBM offers software for both Intel and AMD processors. Intel refers to its processors by “Processor Number” and AMD by “Model Number”. The processor model number can be preceded by a letter; for example, “x5365” refers to “5365”, which is included in the table within the “5000 to 5499” range.
  2. The PVU requirement for the Intel processor technology indicated is dependent on the maximum number of sockets on the server. If sockets on two or more servers are connected to form a Symmetric Multiprocessing (SMP) Server, the maximum number of sockets per server increases. See single server examples and two or more servers example below.
    Single server examples:
    • 2 socket server with 6 cores per socket requires 840 PVUs (70 per core x 12 cores)
    • 4 socket server with 6 cores per socket requires 2400 PVUs (100 per core x 24 cores)
    • 8 socket server with 6 cores per socket requires 5760 PVUs ( 120 per core x 48 cores)
      Two or more servers with connected sockets example:
    • When sockets on a 2 socket server with 6 cores per socket are connected to sockets on another 2 socket server with 6 cores per socket, this becomes an SMP server with a maximum of 4 sockets per server and 24 cores, and requires 2400 PVUs (100 per core x 24 cores).
  3. This entry does not cover the newest generation of Intel Core processors; it excludes processors that use Intel performance hybrid architecture, which contains two different core types.
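The socket-dependent rule in the post-Nehalem Xeon rows above can be sketched as a small helper. The tier boundaries for socket counts the table does not list explicitly (e.g. 3 sockets) are an assumption here, since the table gives only 2, 4, and >4.

```python
# Hypothetical helper encoding the post-Nehalem Intel Xeon rows of the
# PVU table: the per-core rate depends on the server's maximum socket
# count, and the entitlement is rate x total cores.
def xeon_pvus(max_sockets: int, cores_per_socket: int) -> int:
    if max_sockets <= 2:
        per_core = 70
    elif max_sockets <= 4:      # assumption: 3-socket servers fall in this tier
        per_core = 100
    else:
        per_core = 120
    return per_core * max_sockets * cores_per_socket

# The three single-server examples from note 2:
print(xeon_pvus(2, 6))   # 840
print(xeon_pvus(4, 6))   # 2400
print(xeon_pvus(8, 6))   # 5760
```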

Start Your Journey for Free

Community edition

This freemium edition (up to 1,000 variables/constraints) comes with full features and functionality for unlimited time.

Evaluation

Purchase to Develop and Deploy

Subscribe or buy licenses for in-house application development, plus deployment licenses for commercial use.

Development subscription

Buy a monthly subscription for in-house application development. Cancel anytime.

Development licenses

Deployment licenses


Featured Blogs

FAQs

CPLEX Frequently Asked Questions

The IBM® ILOG® CPLEX® Optimization Studio subscription allows only development use. There are unlimited variables and constraints, and support is included for as long as the subscription is active.

Yes, we offer a no-cost edition of IBM ILOG CPLEX Optimization Studio. The no-cost edition is restricted to problems with up to 1,000 variables and 1,000 constraints. A time-limited trial without restrictions on problem size is available upon request by contacting an IBM sales representative.

Beyond the commercial version, IBM runs the IBM Academic Initiative, through which students and faculty members have access to the product at no cost. The academic edition has no functional or model-size limitations; you must register to check your eligibility. If the program accepts you, you will receive an email notification with download instructions.

The CPLEX subscription can be purchased as a monthly or annual subscription and is charged at the beginning of the billing period. You will be automatically billed on a periodic basis, according to the terms of your subscription.
Before you reach the next month of your subscription offering, contact us to cancel your automatic renewal.
A single subscription may include one or more users, and each user receives a personal key to unlock the software. Each member of your team who uses IBM ILOG CPLEX Optimization Studio needs to have his/her own key. Users can be managed from My Products and Services. When you remove a user from the subscription, his or her key becomes inactive. When you add a user to the subscription, he or she receives a key.
Each day, when the first “solve” happens, the software checks the user's eligibility to use the product without limits. After at least one successful check, the user can remain offline for up to 14 days before the product reverts to limited mode. If the user connects and eligibility is re-checked before the end of the 14th day, the 14-day offline window restarts.
Yes, support is included for the purchased months for subscription and 12 months for the other licensing options.
The IBM ILOG CPLEX Optimization Studio subscription is supported on Windows, macOS, and Linux (on x86 platforms). Please refer to the system requirements information on the supported releases of this product.

Gurobi Frequently Asked Questions

New customers regularly tell us that migrating was easier than they expected, and that they are happy they made the switch to Gurobi.

Gurobi is a special kind of software called a “solver.” But Gurobi doesn’t have a graphical interface the way your familiar consumer apps do. You interface with it through programming languages like Python, AIMMS, and R—so you have to know how to code. And you need to know how to create a mathematical model.

Don’t have those skills in-house? We have a network of trusted service partners who are ready to help.

And at any point along the way, the Gurobi Expert team is here to help with troubleshooting and tuning your mathematical models. We also offer customized training for groups that need help with modeling techniques, model tuning, etc.

Machine learning looks for patterns in historical data and uses those patterns to make predictions about the future. But what happens when your future no longer looks like your past?

With Gurobi, you can make decisions that don’t rely on your past data. You input what you want to achieve, and Gurobi identifies your best set of decisions. And if something changes along the way, no problem! Just adjust your inputs and run it through Gurobi again.

You’ll also need to know how to create a mathematical model. People who know how to code (data scientists, for example) can pick up this skill fairly easily. Check out our example code and basic training to get started.

We don’t currently offer that specific service. But we have trusted partners who do. And the Gurobi Experts team can help customers troubleshoot and tune their models anytime, at no cost. We also offer customized training for groups that need help with modeling techniques, model tuning, etc.

Other decision models—like decision rules or heuristics—can result in sub-optimal decisions because they explore only a tiny percentage of possible solutions. Gurobi, by contrast, can provide provable optimality. And for a business, the difference between “sub-optimal” and “optimal” decisions can be millions in revenue.

Gurobi is a complex product, and some components and settings require expert knowledge. Whether you are migrating from a different solution or starting a project from scratch, we are happy to answer your “how to” questions and point you to the right resources.

Common customer requests:

  • How do I introduce multiple objectives to my optimization model?
  • I need to incorporate a complex business rule into our existing model. Can you recommend a way to model this or point me in the right direction?
  • I want to migrate my solution to Gurobi Instant Cloud. Can you help me extend my existing implementation?

Gurobi is preconfigured with settings that generally work well across a broad set of models. However, it is likely that for a specific set of models, we can find specialized settings that perform even better. We search for such settings by combining our deep understanding of the solver with the power of a large computing cluster.

Common customer requests:

  • We need to solve an optimization problem every 5 minutes. What settings should we use to consistently find the best possible solutions in this timeframe?
  • Can you help us reduce the time it takes to find high-quality solutions for our model instances?
  • Is distributed optimization the right choice for our use case?
  • We are using a variety of custom parameter settings that improved performance when using previous versions of Gurobi. Are these settings still beneficial for the latest version of Gurobi?

Using software libraries the right way can make a significant impact. We can help you use our APIs efficiently, thereby reducing the time required to construct and interact with the model.

Common customer requests:

  • Our model-building phase is far more time-consuming than the actual optimization process. Can you help us improve our Python implementation?
  • We have migrated our model to use Gurobi’s native API. Can you help us review our implementation?
  • Could you review this code snippet and let us know if we are calling these functions properly?

Modern applications leverage the power of scalable virtualized environments—but not all of these environments fit or scale well for optimization applications.

Whether you are building a new optimization application or updating an existing application, we can recommend architecture deployments that satisfy your technical needs. We make sure you are comfortable with any migration processes involved, and we can assist during upgrades to reduce the risk of service interruptions.

Common customer requests:

  • Can you please help us set up our Compute Servers in a high-availability deployment?
  • We are upgrading our current architecture to use microservices. Considering our models and business case, should we run our optimization processes directly on worker nodes or offload the computation to dedicated servers in the same cluster?
  • Considering our benchmark results and projected production optimization usage, what would you recommend for our on-premises architecture?

As you develop your optimization application, you may encounter questions like why the solver behaves in a certain way or how a change to the model leads to a particular outcome. We can help you understand exactly what is happening.
Common customer requests:

  • The numeric statistics of my model seem reasonable. Why does the solve time vary significantly after the data changes slightly?
  • I doubled the number of cores used to solve my models. Why don’t I see significant speed improvements? What is more important for my model: a faster clock speed or more cores?
  • Can you help me understand what’s happening in the log file?
  • Why does Gurobi display warnings, and what can I do about it?

After successfully tuning your models to achieve the best performance on your hardware, you may want to assess if further solution time reductions are possible. You are considering different ideas, such as reformulating your model, exploiting the knowledge of your problem, etc. We can help you by engaging in technical discussions based on your ideas and use case. Our goal is to make your implementation efforts more effective and get the most out of the solver.

Common customer requests:

  • We are not happy with the overall performance of our current implementation. Can you help us explore a decomposition approach for which we can leverage Gurobi’s performance on the resulting subproblems?
  • We have observed that for some of our model instances, feasible solutions are found quickly, but it is taking considerable time to prove optimality. Do you see any obvious improvements we can make to our formulation?
  • We have gained insight into our problem instances and would like to refine our solution strategy to reduce our solve times. Can you help review our callback implementation to incorporate custom cuts to the solution process and define a more advanced termination criterion?

FICO Frequently Asked Questions

For the past sixty years, FICO has established itself as a leader in decision analytics. More recently, FICO has developed the leading decision platform for AI and ML development. The FICO Developer Experience builds on the strengths of FICO analytics, advocating for developers, providing enhanced features, open APIs, and information and resources to help you work more efficiently.

Building on the FICO Platform enables you to design and deploy decisioning applications quickly, accelerating the time it takes from product concept to realizing the application’s value.


The cloud capabilities and architecture of FICO Platform, including its use of containers, microservices, APIs, and related technologies, are described in the white paper FICO Platform Reference Architecture. We also provide best practices for microservices and go into more technical detail about the standardized analytic and execution services underpinning the platform. Finally, we cover our comprehensive approach to security and cloud-managed services.

Improved Productivity: These individual services are small, autonomous, and self-contained. Breaking an application down into discrete parts makes it easier to build and maintain. They communicate with other services using well-defined, standardized APIs. This allows different microservices to communicate even when they use different technologies or programming languages.

Better Resiliency: A microservices architecture is distributed and loosely coupled, meaning that a failure in one service won’t take down other services. Individual services can be scaled to meet a surge in demand rather than needing to scale the whole application. This requires that a microservices architecture provide auditability and traceability so applications can be packaged, deployed, and scaled in a consistent manner.

Increased Scalability: Microservices liberate applications and business processes from the complex dependencies of monolithic architectures. Traditional monolithic architectures run as an aggregate service, so a single failure at one point can bring down a whole application, and the entire system must be scaled to meet a surge in demand for one of its components. It also means applications become increasingly problematic as the code base grows, stifling innovation, consuming IT infrastructure, and complicating upgrades.

A microservice architecture allows software to be developed for a specific business process or service, so companies can build applications that are fully aligned with their business needs with the agility to respond to market changes with unprecedented speed.


A container is a package that includes an application or service, its configuration, and all its dependencies, such as code, system tools, libraries, settings, runtime, and anything else required to execute the program. A container allows an application to be decoupled from the environment in which it runs, making it easily and consistently deployable across environments.

FICO Platform Core offers open application programming interfaces (APIs) that allow other software vendors and customers to extend the platform with complementary capabilities or connect it to other enterprise systems such as CRM or ERP. Fundamentally, FICO Platform Core utilizes Kubernetes. Kubernetes is an open-source container orchestration system for automating application deployment, scaling, and management. Kubernetes enables FICO Platform, the FICO solution for developing analytically powered, decision automation solutions, to support seamless and cost-effective deployment on public and private clouds as well as across hybrid cloud deployments.

FICO Platform, along with the Community, is free to join, use, and participate in. The various solutions, analytic components, and tools have a cost that may depend on the use case, deployment model, size of business, or other factors. Feel free to leverage the FICO Community to determine which components may be right for you, and inquire directly with FICO about how to integrate the components or solutions and build an analytically powered solution that works for you.

Certain solutions available through the FICO Platform are enterprise solutions intended for multiple users within a given organization. When subscribing to a multi-user solution, the subscriber’s application administrator will have rights to add (or remove) users and define roles within the limits of the subscription. Other FICO Platform solutions are single-user applications, with each user assigned to an individual subscription.

To use FICO Platform, in most instances, you will not need to install any software. In general, all the functionality and capabilities of FICO Platform will be supported as a cloud-based, software-as-a-service (SaaS) solution. Some specialized applications provide specific functionality which will require the installation of local components. Please reach out on the FICO Community with any questions.

FICO Platform is currently provided in English (only). Certain applications and tools available through FICO Platform are available in other languages. For details about our cloud offerings, see the product descriptions available via the DECISION MANAGEMENT SUITE section of FICO Platform. If you would like more information about language options for any specific solution, please contact us.


For product support or technical help with FICO Platform, please contact Support. You must log in to your FICO Platform account to access the “Request Support” link from the portal. For questions regarding pricing, product info, or to be contacted by a FICO sales representative, please contact us. To submit a general FICO Platform portal support question without logging into the portal, submit the Contact Us form with the “FICO Platform Support” nature of inquiry checkbox selected. A FICO support representative will follow up shortly.

FICO offers technical support and services ranging from self-help tools to direct assistance with a FICO technical support engineer. Support is available to all clients who have purchased a FICO product and have an active support or maintenance contract. You can find support contact information and a link to the Customer Self Service Portal (online support) on the Product Support home page.

The FICO Customer Self Service Portal is a secure web portal that is available 24 hours a day, 7 days a week from the Product Support home page. The portal allows you to open, review, update, and close cases, as well as find solutions to common problems in the FICO Knowledge Base.

Decision Optimization Frequently Asked Questions

In linear programming, feasibility ensures that the solution lies within the feasible region defined by the constraints. This is essential for accurate and meaningful optimization results.

Identifying a feasible solution is crucial as it lays the foundation for further optimization processes. It ensures that the solution adheres to the specified constraints and is a potential candidate for optimization.

Computational methods rely on numerical approximations and algorithms suitable for computer implementation, while analytical methods aim for exact, symbolic solutions.

Examples include Gaussian elimination for solving linear systems, iterative methods for eigenvalue computations, and numerical algorithms for singular value decomposition.

The accuracy of computational methods is crucial, as numerical errors can accumulate. Therefore, choosing appropriate algorithms and considering precision in computations is vital for obtaining reliable linear algebra solutions.

An example could be the optimal allocation of production resources among different products to maximize profit, considering constraints like production capacity and resource availability.

Sensitivity analysis helps assess the impact of changes in coefficients or constraints on the optimal solution, providing insights into the robustness of the allocation plan.

Branch and bound is an algorithmic technique used in integer programming to systematically explore feasible solutions by dividing the problem into smaller subproblems and bounding the solution space.
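As an illustrative sketch of the branching and bounding steps described above, the following solves a small 0/1 knapsack (a simple integer program) by recursively exploring take/skip subproblems and pruning any subproblem whose fractional-relaxation bound cannot beat the incumbent. The item data is hypothetical.

```python
# Minimal branch-and-bound sketch for a 0/1 knapsack (illustrative data).
# The bound on each subproblem is the fractional (LP-relaxation) value
# of the remaining items, so any node whose bound <= incumbent is pruned.

def fractional_bound(values, weights, capacity, i, current):
    """Upper bound: take remaining items greedily, allowing one fraction."""
    bound, room = current, capacity
    for j in range(i, len(values)):
        if weights[j] <= room:
            room -= weights[j]
            bound += values[j]
        else:
            bound += values[j] * room / weights[j]  # fractional last item
            break
    return bound

def branch_and_bound(values, weights, capacity):
    # Sort by value density so the greedy fractional bound is valid.
    order = sorted(range(len(values)),
                   key=lambda j: values[j] / weights[j], reverse=True)
    values = [values[j] for j in order]
    weights = [weights[j] for j in order]
    best = 0

    def explore(i, cap, value):
        nonlocal best
        if value > best:
            best = value
        if i == len(values) or fractional_bound(values, weights, cap, i, value) <= best:
            return  # prune: this subproblem cannot beat the incumbent
        if weights[i] <= cap:                 # branch 1: take item i
            explore(i + 1, cap - weights[i], value + values[i])
        explore(i + 1, cap, value)            # branch 2: skip item i

    explore(0, capacity, 0)
    return best

print(branch_and_bound([60, 100, 120], [10, 20, 30], 50))  # → 220
```

Production MIP solvers add cutting planes, heuristics, and sophisticated node selection on top of this same skeleton.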

The cost of IBM CPLEX is influenced by factors such as the type of license (academic, commercial), the edition (e.g., CPLEX Optimization Studio), and the number of users or cores required for deployment.

IBM CPLEX generally offers a free trial version for users to explore its features. However, for long-term or commercial use, a purchased license is required.

Cresco University offers online courses on linear programming.

Prerequisites vary, but a basic understanding of algebra and mathematical concepts is usually recommended. Some courses may also require familiarity with introductory optimization principles.

The duration of online courses varies. Introductory courses may take a few hours, while more comprehensive courses covering advanced topics may extend to a few weeks.

Yes, the online courses offered by Cresco University incorporate practical applications and hands-on exercises to reinforce theoretical concepts. This ensures that learners can apply their knowledge to real-world problems.

A MIP solver, or Mixed Integer Programming solver, is a software tool or algorithm designed to find optimal solutions for optimization problems that involve a mix of both continuous and integer decision variables.

MIP solvers can handle large-scale optimization problems, but the computational complexity may increase with problem size. Efficient implementation and algorithmic advancements aim to address scalability issues.

While a feasible solution meets all constraints, it may not be the most optimal. An optimal solution, on the other hand, is the best among all feasible solutions in terms of maximizing or minimizing the objective function.

Yes, a problem can have multiple feasible solutions. In fact, it is common for optimization problems to have a range of feasible solutions, and the challenge lies in finding the best one.

In linear programming, a feasible solution is a vector of decision variables that satisfies all the system’s constraints, making it a valid solution within the feasible region.

To check feasibility, substitute the decision variable values into the constraints. If all constraints are satisfied, the solution is feasible; otherwise, adjustments are needed.
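The substitution check described above can be sketched in a few lines. The two-variable constraint system below is hypothetical; each constraint is stored in the standard a·x ≤ b form, with lower bounds written as negated rows.

```python
# Checking feasibility by substituting candidate values into the constraints.
# Each constraint is a (coefficients, bound) pair meaning a·x <= b;
# the data below is a hypothetical two-variable example.

constraints = [
    ([2, 1], 10),    # 2x + y <= 10
    ([1, 3], 15),    # x + 3y <= 15
    ([-1, 0], 0),    # x >= 0, written as -x <= 0
    ([0, -1], 0),    # y >= 0
]

def is_feasible(x, constraints, tol=1e-9):
    """A point is feasible if every constraint a·x <= b holds (within tolerance)."""
    return all(
        sum(a_i * x_i for a_i, x_i in zip(a, x)) <= b + tol
        for a, b in constraints
    )

print(is_feasible([3, 4], constraints))   # 2*3+4=10 and 3+12=15 both hold → True
print(is_feasible([5, 2], constraints))   # 2*5+2=12 > 10 → False
```

The small tolerance mirrors the feasibility tolerances that production solvers apply to absorb floating-point rounding.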

Yes, it is possible for a linear programming problem to have no feasible solution if the constraints are contradictory or incompatible.

An allocation problem in linear programming involves distributing resources among competing activities or entities in an optimal manner, typically subject to constraints on resource availability.

Branch and bound incorporates branching steps that explore both integer and non-integer solutions, allowing the algorithm to converge towards an optimal integer solution.

While primarily designed for integer programming, branch and bound can be adapted for continuous problems, but it excels in scenarios where integer solutions are critical.

Computational methods in linear algebra involve using numerical techniques to solve mathematical problems related to matrices, vectors, and linear equations using computers.

Yes, IBM CPLEX pricing is often customizable to cater to the unique needs of users and organizations, allowing flexibility in choosing features and capabilities.

You can download IBM CPLEX from the official IBM website, where trial versions, updates, and patches are usually available for registered users.

Yes, registration on the IBM website is typically required to access the CPLEX downloads. This allows IBM to track usage, provide support, and manage licenses.


Both one-time payment and subscription options may be available. Subscription options often include ongoing support, updates, and access to new versions during the subscription period.

Yes, users can typically request a quote from IBM to get a customized pricing estimate based on their specific requirements and deployment scenarios.

The CPLEX Python API is a programming interface that allows users to interact with and control IBM CPLEX optimization engines using the Python programming language.

To use the CPLEX Python API, you need to install the CPLEX Optimization Studio and then import the CPLEX module into your Python scripts.

The CPLEX Python API enables users to create, modify, and solve optimization problems using CPLEX. It also allows users to retrieve and analyze optimization results within their Python environment.

Yes, the CPLEX Python API is designed to integrate seamlessly with popular Python libraries such as NumPy and pandas, facilitating the handling of data for optimization problems.

An example could be optimizing the allocation of resources (e.g., workforce, production machines) to meet product demand across different locations, considering transportation costs and storage limitations.

Linear programming in the supply chain optimizes resource allocation, production planning, and distribution to minimize costs while meeting demand, taking into account constraints like capacity and logistics.

Linear programming is applied in various industries, including manufacturing, logistics, retail, and transportation, where optimizing resource allocation and minimizing costs are critical for success.

The maximum flow problem involves finding the maximum amount of flow that can be sent from a designated source to a target in a flow network, subject to capacity constraints on the edges.
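A compact sketch of one standard solution method, the Edmonds-Karp algorithm, is shown below: repeatedly find a shortest augmenting path by breadth-first search and push the bottleneck residual capacity along it. The four-node network is hypothetical.

```python
# Edmonds-Karp max-flow sketch (BFS augmenting paths) on a small
# hypothetical network; capacity[u][v] is the capacity of edge u -> v.
from collections import deque

def max_flow(capacity, source, sink):
    n = len(capacity)
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        # BFS for a shortest augmenting path in the residual network.
        parent = [-1] * n
        parent[source] = source
        queue = deque([source])
        while queue and parent[sink] == -1:
            u = queue.popleft()
            for v in range(n):
                if parent[v] == -1 and capacity[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    queue.append(v)
        if parent[sink] == -1:
            return total  # no augmenting path left: flow is maximum
        # Find the bottleneck residual capacity along the path.
        push, v = float("inf"), sink
        while v != source:
            u = parent[v]
            push = min(push, capacity[u][v] - flow[u][v])
            v = u
        # Augment along the path, recording cancellation on reverse edges.
        v = sink
        while v != source:
            u = parent[v]
            flow[u][v] += push
            flow[v][u] -= push
            v = u
        total += push

cap = [
    [0, 10, 10, 0],   # node 0 (source) -> nodes 1 and 2
    [0, 0, 2, 8],     # node 1 -> nodes 2 and 3
    [0, 0, 0, 9],     # node 2 -> node 3 (sink)
    [0, 0, 0, 0],
]
print(max_flow(cap, 0, 3))  # → 17
```

The result matches the capacity of the minimum cut into the sink (8 + 9 = 17), as the max-flow min-cut theorem guarantees.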

MILP is applied to problems where decisions involve a mix of continuous and discrete choices, such as production planning with discrete batch sizes or facility location problems with integer variables.

MILP problems can become computationally intensive, especially as the number of integer variables increases. Balancing the precision of the solution with computational resources is a common challenge.

Real-time decision-making with MIP solvers depends on the problem size and complexity. In many cases, MIP solvers provide near-optimal solutions within reasonable time frames, making them suitable for certain real-time applications.

SciPy’s optimization module, especially the scipy.optimize package, is commonly used for nonlinear optimization in Python. Other libraries, such as Pyomo and CasADi, also offer nonlinear optimization capabilities.

Advanced Optimization Frequently Asked Questions

Formulating a real-world problem as a mathematical optimization model involves translating the key decisions, objectives, and constraints into mathematical expressions. Common techniques include linear programming (decision variables, linear objective and constraints), mixed-integer programming (adding binary or integer variables), quadratic programming (quadratic objective or constraints), and other model types depending on the problem structure. Logical conditions can be modeled using binary variables and constraints, piecewise functions can be approximated using variable disaggregation or special ordered set (SOS) constraints, and nonlinear relationships can be directly included in nonlinear models or approximated using piecewise-linear functions.

The choice of optimization model type involves trade-offs between model accuracy and computational complexity. Linear models are generally easier to solve but may oversimplify the problem, while nonlinear models can better represent real-world complexities but may be more challenging computationally. Incorporating integer variables increases model realism but also combinatorial complexity. Model size is also a key consideration, with larger models potentially requiring more time and memory to solve. In some cases, reformulating the model or exploiting specific structures (e.g., network flows, covering/packing constraints) can improve solver performance.

Optimization under uncertainty can be addressed using techniques like stochastic programming (explicitly modeling uncertain parameters and optimizing over scenarios), robust optimization (optimizing against the worst-case realization of uncertain parameters), and chance-constrained optimization (allowing constraint violations with a specified probability). These approaches require generating representative scenarios or distributions for the uncertain parameters, as well as incorporating risk measures (e.g., expected value, conditional value-at-risk) into the objective function or constraints. Applications with uncertain demand, supply, prices, or other parameters can benefit from these techniques to improve solution robustness and manage risk.


Optimization solvers employ a variety of algorithms to solve different classes of problems. For linear programming, the simplex algorithm and interior point methods are commonly used. For mixed-integer linear programming, branch-and-bound algorithms are widely employed, often combined with cutting planes and other techniques. For nonlinear programming, methods like sequential quadratic programming, interior point, and outer approximation are popular choices. Each algorithm has its strengths and weaknesses, with some better suited for specific problem structures or characteristics (e.g., simplex for sparse LPs, interior point for dense LPs, outer approximation for non-convex NLPs).

Several techniques can be employed to accelerate the solution process and reduce computation time. Presolving techniques, such as constraint propagation, variable fixings, and coefficient tightening, can simplify the model before solving. Cutting planes (valid inequalities that tighten the feasible region) and primal heuristics (quickly generating good feasible solutions) can improve the performance of branch-and-bound algorithms. Additionally, careful problem formulation, scaling, and tuning solver parameters (e.g., tolerances, branching rules, cut generation) can significantly impact solution times. Leveraging parallel processing capabilities, either within the solver or across multiple solver instances, can also provide computational speedups.

Optimization solvers employ various techniques to manage numerical issues and ensure solution quality. Presolving and scaling methods can improve numerical conditioning and reduce the impact of ill-conditioning and degeneracy. Techniques like basis repair and refactorization are used to maintain numerical stability during the solution process. Solvers also enforce feasibility and optimality tolerances to account for numerical precision limitations, allowing for slightly infeasible or suboptimal solutions within specified thresholds. Solver status reports and solution quality indicators (e.g., duality gaps, constraint violations) provide users with information to assess the reliability and accuracy of the reported solutions.

Multi-objective optimization problems involve optimizing multiple conflicting objectives simultaneously. Common techniques include scalarization methods (e.g., weighted sum, epsilon-constraint) that combine the objectives into a single scalar function, and generating Pareto frontiers that characterize the set of optimal trade-offs between objectives. Goal programming can also be used to prioritize and achieve target levels for each objective. In many applications, such as portfolio optimization, product design, and resource allocation, decision-makers must balance competing objectives like cost, performance, and risk. These techniques provide a systematic way to explore and identify preferred solutions from the set of Pareto-optimal alternatives.
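As a toy illustration of Pareto filtering and weighted-sum scalarization, the sketch below takes a handful of hypothetical candidate solutions with two minimized objectives (cost, risk), discards dominated ones, and then picks one preferred solution with assumed weights.

```python
# Sketch: identifying Pareto-optimal trade-offs among candidate solutions
# and scalarizing with a weighted sum. The (cost, risk) data is hypothetical;
# both objectives are minimized.

candidates = {
    "A": (100, 0.30),
    "B": (120, 0.20),
    "C": (150, 0.10),
    "D": (130, 0.25),   # dominated by B (costs more AND is riskier)
}

def pareto_front(points):
    """Keep solutions not dominated by any other (<= in both, < in one)."""
    front = {}
    for name, (c1, r1) in points.items():
        dominated = any(
            c2 <= c1 and r2 <= r1 and (c2 < c1 or r2 < r1)
            for other, (c2, r2) in points.items() if other != name
        )
        if not dominated:
            front[name] = (c1, r1)
    return front

front = pareto_front(candidates)
print(sorted(front))  # → ['A', 'B', 'C'], since D is dominated

# Weighted-sum scalarization: assumed weights encode the decision-maker's
# trade-off; the preferred solution minimizes w_cost*cost + w_risk*risk.
w_cost, w_risk = 1.0, 250.0
best = min(front, key=lambda n: w_cost * front[n][0] + w_risk * front[n][1])
print(best)  # → 'B' (score 120 + 50 = 170, lowest on the front)
```

Sweeping the weights over a range is one simple way to trace out the Pareto frontier; the epsilon-constraint method is typically preferred when the frontier is non-convex.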

Large-scale optimization problems can be addressed through decomposition methods that exploit problem structure and divide the problem into smaller subproblems. Examples include Benders decomposition (splitting complicating variables), Dantzig-Wolfe decomposition (convexifying complex constraints), and column generation (generating variables as needed). Distributed optimization algorithms, such as alternating direction method of multipliers (ADMM) and progressive hedging, can also be used to distribute the computation across multiple solvers or computing nodes. Careful memory management, leveraging sparse data structures, and exploiting parallelism can further aid in solving large-scale problems. Applications in areas like energy systems, supply chain optimization, and traffic routing often involve large-scale or distributed optimization problems.

Most commercial and open-source optimization solvers provide APIs (application programming interfaces) and callback functions that allow users to embed the solvers within larger applications or workflows. These APIs support various programming languages (e.g., C++, Python, Java) and communication protocols (e.g., file-based, in-memory, remote). Callback functions enable users to interact with the solver during the solution process, querying information or injecting application-specific logic (e.g., lazy constraints, user cuts, heuristic solutions). Techniques like algebraic modeling languages and solver-independent modeling layers can facilitate integration with databases, simulation tools, spreadsheets, and decision support systems. Such integration is valuable in applications like supply chain optimization, energy system planning, and financial engineering.

Logical disjunctions (e.g., “either X or Y must hold”) and implied constraints (e.g., “if X, then Y”) can be challenging to model directly in optimization problems. The big-M reformulation is a common technique that introduces binary variables and large “big-M” coefficients to enforce the logical conditions. For example, a disjunction like “X ≥ 5 or Y ≤ 10” can be modeled as X ≥ 5 – M*(1-z), Y ≤ 10 + M*z, where z is a binary variable and M is a large positive constant. Convex hull formulations seek to tighten these big-M constraints by deriving the convex hull of the disjunctive set. Disjunctive programming extends this idea, providing an algebraic modeling framework for representing disjunctions directly. These techniques can significantly reduce solution times by removing symmetry and providing stronger formulations.
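A quick brute-force check can confirm that the big-M constraints above really encode the disjunction: a point should satisfy the reformulation for some choice of the binary z exactly when it satisfies the original "X ≥ 5 or Y ≤ 10". M = 1000 is assumed large enough for the sampled range.

```python
# Brute-force check that the big-M reformulation from the text reproduces
# the disjunction "X >= 5 or Y <= 10": a point is feasible iff some binary
# z makes both big-M constraints hold. M = 1000 is assumed large enough
# for the sampled grid.
M = 1000

def big_m_feasible(x, y):
    # z = 1 enforces X >= 5 (Y side relaxed); z = 0 enforces Y <= 10.
    return any(
        x >= 5 - M * (1 - z) and y <= 10 + M * z
        for z in (0, 1)
    )

def disjunction(x, y):
    return x >= 5 or y <= 10

# The two formulations agree on every point of a sample grid.
assert all(
    big_m_feasible(x, y) == disjunction(x, y)
    for x in range(-20, 21)
    for y in range(-20, 21)
)
print("big-M reformulation matches the disjunction on the grid")
```

In a real MIP model the same constraints would be linear rows with z as a binary variable; choosing M as small as the variable bounds allow keeps the LP relaxation tight, which is exactly what the convex hull formulations mentioned above systematize.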

Piecewise linear functions, which are continuous but have different slopes over different domains, arise in many applications like transportation, production planning, and pricing problems. Variable disaggregation with special ordered set (SOS) constraints is a common approach, where the domain is partitioned into intervals, with a new variable and SOS constraint representing the function over each interval. Convex combination models use weighting variables to blend multiple linear functions into a piecewise linear approximation. Multiple choice constraints provide another option, enforcing that exactly one linear function is active within each domain partition. While tighter formulations can improve solution times, the increased model size from introducing additional variables and constraints is an important trade-off to consider.
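The convex combination idea can be sketched numerically: any point in the domain is expressed as a weighted average of its two adjacent breakpoints, which is exactly the role the SOS2 condition (at most two consecutive nonzero lambda weights) plays inside a solver. The breakpoints below are a hypothetical cost curve.

```python
# Convex-combination (lambda) sketch of a piecewise linear function: a point
# is written as a weighted average of two adjacent breakpoints, mirroring
# the SOS2 condition that at most two consecutive lambda weights are nonzero.
# Breakpoint data is hypothetical (e.g., a production-cost curve).

xs = [0, 10, 25, 40]     # breakpoints on the domain
ys = [0, 30, 60, 120]    # function values at the breakpoints

def piecewise_value(x):
    """Evaluate f(x) by interpolating between the two adjacent breakpoints."""
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            # lam and (1 - lam) are the two nonzero SOS2 weights.
            lam = (xs[i + 1] - x) / (xs[i + 1] - xs[i])
            return lam * ys[i] + (1 - lam) * ys[i + 1]
    raise ValueError("x outside the modeled domain")

print(piecewise_value(5))    # midpoint of [0, 10] → 15.0
print(piecewise_value(30))   # one third into [25, 40] → 80.0
```

In an optimization model the lambda weights become decision variables constrained to sum to one, with an SOS2 (or binary multiple-choice) constraint selecting the active segment.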

Conditional Value-at-Risk (CVaR) is a widely used risk measure that captures the expected loss beyond a specified risk level (Value-at-Risk). Nested risk constraints allow enforcing risk limits across multiple levels (e.g., individual assets and the overall portfolio). These risk measures can be incorporated into stochastic optimization models using mixed-integer reformulations or approximation techniques. For example, CVaR can be modeled by introducing an auxiliary variable and a set of constraints that enforce the CVaR definition. Nested risk constraints can be modeled using disjunctive constraints or by decomposing the problem into a nested sequence of CVaR sub-problems. Applications in finance (portfolio optimization), reliability engineering, and supply chain risk management often require managing CVaR and nested risk constraints.
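On scenario data, VaR and CVaR can be computed directly, which is a useful sanity check for the optimization reformulation: VaR is the alpha-quantile of the loss sample and CVaR is the average of the losses at or beyond it. The loss figures below are hypothetical.

```python
# Sample-based sketch of Value-at-Risk and Conditional Value-at-Risk at
# level alpha: VaR is the alpha-quantile of the scenario losses and CVaR
# is the mean of the losses at or beyond it. Loss figures are hypothetical.
import math

def var_cvar(losses, alpha=0.95):
    ordered = sorted(losses)
    k = math.ceil(alpha * len(ordered)) - 1   # index of the alpha-quantile
    var = ordered[k]
    tail = ordered[k:]                        # losses at or beyond VaR
    return var, sum(tail) / len(tail)

losses = [1, 2, 2, 3, 3, 4, 5, 6, 8, 20]      # ten scenario losses
var, cvar = var_cvar(losses, alpha=0.9)
print(var, cvar)   # VaR = 8, CVaR = (8 + 20) / 2 = 14.0
```

In the Rockafellar-Uryasev formulation, the auxiliary variable mentioned above plays the role of VaR, and a linear constraint per scenario captures the tail average, which is what makes CVaR tractable inside linear and mixed-integer models.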

Many real-world optimization problems exhibit significant sparsity, meaning most decision variables or constraint coefficients are zero. Solvers use sparse matrix data structures and algorithms to store and operate only on the non-zero elements, reducing memory requirements and computational effort. Graph partitioning and nested dissection techniques can further improve performance by reordering variables and constraints to minimize fill-in during matrix factorizations. Structure-exploiting algorithms like the network simplex method and basis crossover techniques are designed for specific structures like network flows and embedded networks. These methods can provide significant speedups for structured problems compared to general-purpose simplex implementations.

When solving a sequence of related optimization problems, such as in real-time planning or scenario analysis, warm-starting and reusing information from previous solves can greatly accelerate solution times. The solver can be initialized with an advanced start, providing incumbent solutions or partial solutions as an initial point. Reusing basis information from a related problem can reduce the work needed to re-optimize after minor changes. The solver may also be able to reuse cutting planes, lower/upper bounds, or other information generated during previous solves. Some solvers support compression and reuse of the branch-and-bound tree across problems. Careful implementation is needed to properly manage solution pools and caching across related optimization instances.

Parallel and distributed computing techniques can significantly improve solution times for large optimization problems. At a low level, many solver components like matrix operations and cut generation can be parallelized. For branch-and-bound solvers, multiple nodes of the solution tree can be explored in parallel, with careful load balancing and communication. Decomposition methods like Benders or Dantzig-Wolfe decomposition are also amenable to parallelization, with subproblems solved independently and periodically coordinated. High-performance computing (HPC) implementations using technologies like MPI and CUDA can leverage large compute clusters and GPU acceleration. Key challenges include managing overhead from communication, synchronization, and load imbalances across parallel tasks or distributed solver instances.

Many engineering design, control, and simulation optimization problems involve black-box or simulation-based objective functions where derivatives are unavailable or impractical to calculate. Derivative-free optimization (DFO) methods like pattern search, NEWUOA, and model-based techniques (EGO/CORS) can address these problems. Pattern search methods generate a sequence of trial points by perturbing the current best solution. Model-based methods like EGO (Efficient Global Optimization) build surrogate models of the black-box function and use statistics to balance exploration and local optimization. The sample average approximation (SAA) method converts the stochastic simulation into a sample average approximation problem that can be optimized with standard techniques. Combining DFO algorithms with optimization solvers often requires customized implementations or solver-coupling frameworks.
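The pattern-search idea described above can be sketched in a few lines: probe the current point along coordinate directions, move on the first improvement, and shrink the step when no probe improves. The quadratic objective is an assumed test function with its minimum at (3, -1).

```python
# Minimal pattern (compass) search sketch for derivative-free minimization:
# probe along coordinate directions, accept improving moves, and shrink the
# step size when no probe improves. The objective is an assumed test function.

def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=10_000):
    x, fx = list(x0), f(x0)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                ft = f(trial)
                if ft < fx:               # accept the first improving probe
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5                   # no improvement: refine the mesh
            if step < tol:
                break
    return x, fx

# Hypothetical smooth objective with minimum at (3, -1); no derivatives used.
objective = lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2
x, fx = pattern_search(objective, [0.0, 0.0])
print([round(c, 3) for c in x], round(fx, 6))  # → [3.0, -1.0] 0.0
```

Model-based DFO methods such as EGO replace the fixed probe pattern with a surrogate model of the black-box function, which is usually far more sample-efficient when each evaluation is an expensive simulation.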

MINLPs, which combine nonlinear functions with discrete variables, are among the most challenging optimization problems. Outer approximation algorithms iteratively build piecewise-linear approximations of the nonlinear functions and solve a sequence of mixed-integer linear programs (MILPs). Generalized Benders decomposition splits the MINLP into a master MILP and a continuous nonlinear subproblem, adding cuts derived from the subproblem solutions. Extended cutting plane methods generate linear cutting planes from the nonlinear functions to tighten the MILP approximations. Other techniques include branching on nonlinear terms, nonlinear programming-based heuristics, and convex relaxations. Non-convexities and weak relaxations make it challenging to provide optimality guarantees for general MINLPs, but these algorithms can often find high-quality solutions or prove optimality for specific problem classes.

Robust optimization seeks solutions that are immune to realizations of uncertain data within a specified uncertainty set. This can be reformulated as a deterministic problem by replacing uncertain coefficients with their worst-case values from the uncertainty set. Chance-constrained programs require constraints to be satisfied with a specified probability level. Safe approximations for chance constraints include scenario-based methods that enforce the constraints over a finite sample of scenarios, and analytical approximations using conic duality or Bernstein approximations. To solve these reformulated robust or chance-constrained problems, techniques like column-and-constraint generation can be employed, iteratively generating variables/constraints to approximate the robust counterpart. Applications in areas like network design, portfolio optimization, and energy systems often require robustness to parameter uncertainty.
