Examples
The following examples demonstrate the basic functionality provided by the MOEA Framework. Links to the full source code are provided alongside each code snippet. You may also find these and more examples in the demo application on the downloads page.
- Setup
- Example 1: Simple Run
- Example 2: Quality Indicators
- Example 3: Customizing Algorithms
- Example 4: Statistical Comparison of Algorithms
- Example 5: Collecting Runtime Dynamics
- Example 6: Defining New Problems
Setup
In order to run these examples or use the MOEA Framework, Java 8 (or a later version) must be installed on your computer. The Java 8 development kit (JDK) for Windows and Linux can be downloaded here.
To run these examples, first download and extract the latest compiled binaries from the downloads page. Windows users may extract the downloaded file using 7-zip. The files will extract to a folder called MOEAFramework-4.5. This folder will look similar to:
- MOEAFramework-4.5/
  - docs/
  - examples/
  - javadoc/
  - lib/
  - licenses/
  - pf/
  - COPYING
  - launch-diagnostic-tool.bat
  - moeaframework.properties
  - README.md
All of the examples below are in the examples/ folder. You may compile and run an example using the following commands. Run these commands in the Command Prompt from the MOEAFramework-4.5 folder.
javac -cp "examples;lib/*" examples/Example1.java
java -cp "examples;lib/*" Example1
If you receive the message "'javac' is not recognized as an internal or external command, operable program or batch file", try the following steps to set up your environment on Windows or Linux. Unix/Linux users should replace the semicolons (;) in the classpath with colons (:), as shown below.
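For example, the equivalent compile and run commands on Unix/Linux are:

javac -cp "examples:lib/*" examples/Example1.java
java -cp "examples:lib/*" Example1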
Example 1: Simple Run
Create and solve the bi-objective DTLZ2 test problem using NSGA-II. At the end, we display the Pareto front.
Problem problem = new DTLZ2(2);

NSGAII algorithm = new NSGAII(problem);
algorithm.run(10000);

algorithm.getResult().display();
Displaying the Pareto front will print all the decision variables, objectives, and constraints (if any).
Var1     Var2     Var3     Var4     Var5     Var6     Var7     Var8     Var9     Var10    Var11    Obj1     Obj2
-------- -------- -------- -------- -------- -------- -------- -------- -------- -------- -------- -------- --------
1.000000 0.500094 0.505473 0.502753 0.500365 0.499805 0.500569 0.496846 0.503417 0.500038 0.501108 0.000000 1.000061
0.000000 0.497015 0.523063 0.490592 0.485922 0.497590 0.505783 0.496919 0.504620 0.502227 0.506841 1.000949 0.000000
0.242941 0.486582 0.498399 0.510138 0.518632 0.499523 0.501980 0.504977 0.503947 0.499732 0.501241 0.928696 0.372669
0.178412 0.513125 0.503748 0.492867 0.499933 0.500295 0.502367 0.494850 0.468585 0.501130 0.508556 0.962265 0.276964
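The same pattern extends to problems with more objectives. Below is a minimal sketch, assuming the DTLZ2 constructor argument is the number of objectives (as the bi-objective example above suggests):

// Sketch: three-objective DTLZ2. The constructor argument is assumed to be
// the number of objectives, matching the bi-objective example above.
Problem problem = new DTLZ2(3);

NSGAII algorithm = new NSGAII(problem);
algorithm.run(10000);

// The displayed Pareto front now contains three objective columns.
algorithm.getResult().display();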
Example 2: Quality Indicators
Quality indicators are used to compare results between different algorithms. Here, we calculate the hypervolume and generational distance relative to a reference set.
Problem problem = new DTLZ2(2);

NSGAII algorithm = new NSGAII(problem);
algorithm.run(10000);

NondominatedPopulation approximationSet = algorithm.getResult();
NondominatedPopulation referenceSet = NondominatedPopulation.loadReferenceSet("pf/DTLZ2.2D.pf");

Indicators indicators = Indicators.all(problem, referenceSet);
indicators.apply(approximationSet).display();
Running this program produces the following output:
Indicator                        Value
-------------------------------- --------
Hypervolume                      0.209256
GenerationalDistance             0.001027
GenerationalDistancePlus         0.002517
InvertedGenerationalDistance     0.004491
InvertedGenerationalDistancePlus 0.002956
AdditiveEpsilonIndicator         0.010052
Spacing                          0.005918
MaximumParetoFrontError          0.042614
Contribution                     0.000000
R1Indicator                      0.446108
R2Indicator                      0.000276
R3Indicator                      0.000419
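If only a single metric is needed, the standalone indicator classes can be used instead of computing everything at once. The sketch below is an assumption based on the indicator API found in earlier MOEA Framework releases (a (Problem, NondominatedPopulation) constructor and an evaluate method); verify the exact class and signature against the javadoc/ folder for your version. It reuses the problem, referenceSet, and approximationSet from the example above.

// Sketch: compute only the hypervolume. The Hypervolume constructor and
// evaluate(NondominatedPopulation) method are assumed from earlier releases.
Hypervolume hypervolume = new Hypervolume(problem, referenceSet);
System.out.println("Hypervolume: " + hypervolume.evaluate(approximationSet));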
Example 3: Customizing Algorithms
The examples above used default parameters for each algorithm. But each algorithm is customizable! In this example, we set up NSGA-II to use the Parent Centric Crossover (PCX) operator, use a population size of 250, and include an archive to store the best solutions.
Problem problem = new DTLZ2(2);

NSGAII algorithm = new NSGAII(problem);
algorithm.setInitialPopulationSize(250);
algorithm.setVariation(new PCX(5, 2));
algorithm.setArchive(new EpsilonBoxDominanceArchive(0.01));

algorithm.run(10000);
algorithm.getResult().display();
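Other operators can be swapped in the same way. The sketch below pairs simulated binary crossover (SBX) with polynomial mutation (PM); the operator classes and their (probability, distribution index) constructors are assumptions based on the framework's standard real-valued operators, so check the javadoc/ folder for the exact signatures in your version.

// Sketch: configure NSGA-II with SBX crossover followed by polynomial mutation.
// The constructor arguments (probability, distribution index) are assumed.
Problem problem = new DTLZ2(2);

NSGAII algorithm = new NSGAII(problem);
algorithm.setVariation(new CompoundVariation(
        new SBX(1.0, 15.0),
        new PM(1.0 / problem.getNumberOfVariables(), 20.0)));

algorithm.run(10000);
algorithm.getResult().display();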
Example 4: Statistical Comparison of Algorithms
Larger experiments are performed using the Executor and Analyzer. The Executor is responsible for configuring and running algorithms. Unlike the previous examples, where we solved a single problem once, the Executor is useful when running multiple seeds, multiple algorithms, or different configurations. The Analyzer performs the statistical analysis. It can compute various performance indicators, including hypervolume, generational distance, inverted generational distance, additive ε-indicator, spacing, and contribution. Additionally, Kruskal-Wallis and Mann-Whitney U tests measure the statistical significance of results.
String problem = "UF1";
String[] algorithms = { "NSGAII", "GDE3", "eMOEA" };

Executor executor = new Executor()
        .withProblem(problem)
        .withMaxEvaluations(10000);

Analyzer analyzer = new Analyzer()
        .withProblem(problem)
        .includeHypervolume()
        .showStatisticalSignificance();

for (String algorithm : algorithms) {
    analyzer.addAll(algorithm, executor.withAlgorithm(algorithm).runSeeds(50));
}

analyzer.display();
Running this program produces the output shown below. We can see that GDE3 and NSGA-II produce the best (largest) hypervolume values. Furthermore, we have determined statistically that there is no significant difference in performance between GDE3 and NSGA-II.
Algorithm Indicator   Min      Median   Max      IQR (+/-) Count Statistically Similar (a=0.05)
--------- ----------- -------- -------- -------- --------- ----- ------------------------------
eMOEA     Hypervolume 0.318756 0.464230 0.533881 0.081216  50
GDE3      Hypervolume 0.437182 0.502810 0.531823 0.031219  50
NSGAII    Hypervolume 0.319489 0.514978 0.544461 0.031242  50
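Running 50 seeds per algorithm can take a while. The sketch below distributes function evaluations across all local cores; the distributeOnAllCores() method is an assumption based on the Executor's parallelization support in earlier releases, so confirm it exists in your version before relying on it.

// Sketch: distribute evaluations across all local cores.
// distributeOnAllCores() is assumed to be available on the Executor.
Executor executor = new Executor()
        .withProblem(problem)
        .withMaxEvaluations(10000)
        .distributeOnAllCores();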
Example 5: Collecting Runtime Dynamics
Runtime dynamics provide insight into the behavior of an optimization algorithm throughout a run. For instance, one can observe how solution quality changes with the number of function evaluations (NFE). The Instrumenter class records the runtime dynamics.
Instrumenter instrumenter = new Instrumenter()
        .withProblem("UF1")
        .withFrequency(100)
        .attachGenerationalDistanceCollector();

new Executor()
        .withProblem("UF1")
        .withAlgorithm("NSGAII")
        .withMaxEvaluations(10000)
        .withInstrumenter(instrumenter)
        .run();

instrumenter.getObservations().display();
The output below shows how the generational distance metric changes over time. We see that NSGA-II rapidly converges toward the reference set (the optimal solutions), since its generational distance approaches 0.
NFE   GenerationalDistance
----- --------------------
100   0.799030
200   0.707753
300   0.438113
400   0.383873
500   0.431799
600   0.372148
700   0.344861
800   0.294252
900   0.294386
1000  0.293309
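Additional collectors can be attached in the same way. The sketch below records hypervolume and elapsed time alongside generational distance; the extra attach methods are assumptions made by analogy with attachGenerationalDistanceCollector(), so verify the names against the javadoc/ folder for your version.

// Sketch: attach additional collectors (method names assumed by analogy
// with attachGenerationalDistanceCollector()).
Instrumenter instrumenter = new Instrumenter()
        .withProblem("UF1")
        .withFrequency(100)
        .attachGenerationalDistanceCollector()
        .attachHypervolumeCollector()
        .attachElapsedTimeCollector();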
Example 6: Defining New Problems
While we provide many test problem suites for comparing optimization algorithms, we can also introduce new problems. As demonstrated below, we need to define two methods: newSolution and evaluate. The newSolution method defines the problem representation (the number and types of its decision variables). The evaluate method takes a solution and computes its objective function values.
public static class Srinivas extends AbstractProblem {

    public Srinivas() {
        super(2, 2, 2);
    }

    @Override
    public void evaluate(Solution solution) {
        double x = EncodingUtils.getReal(solution.getVariable(0));
        double y = EncodingUtils.getReal(solution.getVariable(1));

        double f1 = Math.pow(x - 2.0, 2.0) + Math.pow(y - 1.0, 2.0) + 2.0;
        double f2 = 9.0*x - Math.pow(y - 1.0, 2.0);
        double c1 = Math.pow(x, 2.0) + Math.pow(y, 2.0);
        double c2 = x - 3.0*y;

        solution.setObjective(0, f1);
        solution.setObjective(1, f2);
        solution.setConstraint(0, Constraint.lessThanOrEqual(c1, 225.0));
        solution.setConstraint(1, Constraint.lessThanOrEqual(c2, -10.0));
    }

    @Override
    public Solution newSolution() {
        Solution solution = new Solution(2, 2, 2);
        solution.setVariable(0, new RealVariable(-20.0, 20.0));
        solution.setVariable(1, new RealVariable(-20.0, 20.0));
        return solution;
    }

}
Then, we can solve this problem using:
Problem problem = new Srinivas();

NSGAII algorithm = new NSGAII(problem);
algorithm.run(10000);

algorithm.getResult().display();
Conclusion
In addition to the above examples, we provide many more inside the examples folder. Navigate to the downloads page to download the MOEA Framework or visit our GitHub page for more information.