NiaPy.algorithms

Module with implementations of basic and hybrid algorithms.

class NiaPy.algorithms.Algorithm(**kwargs)[source]

Bases: object

Class for implementing algorithms.

Date: 2018

Author: Klemen Berkovič

License: MIT

Initialize the algorithm and create its name.

Arguments:

name {string} – Full name of algorithm

shortName {string} – Short name of algorithm

NP {integer} – population size

D {integer} – dimension of problem

nGEN {integer} – number of generations/iterations

nFES {integer} – number of function evaluations

benchmark {object} – benchmark implementation object

task {Task} – task to perform optimization on

Raises:

TypeError – Raised when the given benchmark function does not exist.

See: Algorithm.setParameters(self, **kwargs)

normal(loc, scale, D=None)[source]

Get normally distributed random numbers with shape D.

Arguments:

loc {real} – mean of the distribution

scale {real} – standard deviation of the distribution

D {array} or {int} – Shape of the returned random numbers

rand(D=1)[source]

Get random numbers with shape D in the range [0, 1).

Arguments:

D {array} or {int} – Shape of the returned random numbers

randint(Nmax, D=1, Nmin=0, skip=[])[source]

Get random integers with shape D in the range from Nmin to Nmax.

Arguments:

Nmin {integer} – lower bound of the range (inclusive)

Nmax {integer} – upper bound of the range (exclusive)

D {array} or {int} – Shape of the returned random integers

skip {array} – numbers to skip

run()[source]

Start the optimization.

See: Algorithm.runTask(self, task)

runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

runYield(task)[source]

Run the algorithm for only one iteration and return the best solution.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of the best solution

setParameters(**kwargs)[source]

Set the parameters/arguments of the algorithm.

Arguments:

kwargs {dict} – Dictionary with values of the parameters

uniform(Lower, Upper, D=None)[source]

Get uniformly distributed random numbers with shape D in the range from Lower to Upper.

Arguments:

Lower {array} or {real} or {int} – Lower bound

Upper {array} or {real} or {int} – Upper bound

D {array} or {int} – Shape of the returned random numbers
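
A minimal sketch of these helpers in use inside a hypothetical subclass; the Task attributes Lower, Upper, D and the eval method are assumptions inferred from the Task arguments documented on this page:

    from NiaPy.algorithms import Algorithm

    class RandomSearch(Algorithm):
        # Hypothetical subclass, shown only to illustrate the RNG helpers.
        def runTask(self, task):
            NP = 10
            # NP uniformly distributed candidates inside the assumed bounds.
            X = self.uniform(task.Lower, task.Upper, [NP, task.D])
            # Gaussian perturbation of the first candidate.
            X[0] += self.normal(0, 1, task.D)
            # Random population index, never 0 itself.
            i = self.randint(NP, skip=[0])
            return X[i], task.eval(X[i])  # eval is an assumed Task method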

class NiaPy.algorithms.Individual(**kwargs)[source]

Bases: object

Class that represents one solution in a population of solutions.

Date: 2018

Author: Klemen Berkovič

License: MIT

evaluate(task)[source]

Evaluate the solution.

Arguments:

task {Task} – Object with objective function for optimization

generateSolution(task, rnd=numpy.random)[source]

Generate new solution.

Arguments:

task {Task}

e {bool} – evaluate the solution

rnd {random} – Object for generating random numbers

repair(task)[source]

Repair the solution and keep it within the bounds of the problem.

Arguments:

task {Task}
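
A short sketch of the Individual life cycle, assuming a Task instance named task and a keyword-free constructor; the rnd default mirrors the generateSolution signature:

    from numpy import random as rnd
    from NiaPy.algorithms import Individual

    ind = Individual()
    ind.generateSolution(task, rnd)  # draw a random solution within the bounds
    ind.repair(task)                 # clamp the solution back into the bounds
    ind.evaluate(task)               # compute and store the fitness value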

NiaPy.algorithms.basic

Implementation of basic nature-inspired algorithms.

class NiaPy.algorithms.basic.BatAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of Bat algorithm.

Algorithm: Bat algorithm

Date: 2015

Authors: Iztok Fister Jr., Marko Burjek and Klemen Berkovič

License: MIT

Reference paper: Yang, Xin-She. “A new metaheuristic bat-inspired algorithm.” Nature inspired cooperative strategies for optimization (NICSO 2010). Springer, Berlin, Heidelberg, 2010. 65-74.

__init__(self, D, NP, nFES, A, r, Qmin, Qmax, benchmark).

See: Algorithm.__init__(self, **kwargs)

runTask(task)[source]

Run algorithm with initialized parameters.

Return:

{array} – coordinates of the minimum of the objective function found

{decimal} – minimal value of the objective function found

setParameters(NP, A, r, Qmin, Qmax, **ukwargs)[source]

Set the parameters of the algorithm.

Arguments:

NP {integer} – population size

A {decimal} – loudness

r {decimal} – pulse rate

Qmin {decimal} – minimum frequency

Qmax {decimal} – maximum frequency
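
A usage sketch following the __init__ signature above, assuming the Sphere benchmark from NiaPy.benchmarks and that run() forwards the (solution, fitness) pair documented for runTask:

    from NiaPy.algorithms.basic import BatAlgorithm
    from NiaPy.benchmarks import Sphere

    algo = BatAlgorithm(D=10, NP=40, nFES=10000, A=0.5, r=0.5,
                        Qmin=0.0, Qmax=2.0, benchmark=Sphere())
    solution, fitness = algo.run()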

class NiaPy.algorithms.basic.FireflyAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of Firefly algorithm.

Algorithm: Firefly algorithm

Date: 2016

Authors: Iztok Fister Jr, Iztok Fister and Klemen Berkovič

License: MIT

Reference paper: Fister, I., Fister Jr, I., Yang, X. S., & Brest, J. (2013). A comprehensive review of firefly algorithms. Swarm and Evolutionary Computation, 13, 34-46.

alpha_new(a, alpha)[source]

Optionally recalculate the new alpha value.

getBest(xb, xb_f, Fireflies, Intensity)[source]
move_ffa(i, Fireflies, Intensity, oFireflies, alpha, task)[source]

Move fireflies.

runTask(task)[source]

Run.

setParameters(NP=20, alpha=1, betamin=1, gamma=2, **ukwargs)[source]

Set the parameters of the algorithm.

Arguments:

NP {integer} – population size

alpha {decimal} – alpha parameter

betamin {decimal} – betamin parameter

gamma {decimal} – gamma parameter

class NiaPy.algorithms.basic.DifferentialEvolutionAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of Differential evolution algorithm.

Algorithm: Differential evolution algorithm

Date: 2018

Author: Uros Mlakar and Klemen Berkovič

License: MIT

Reference paper: Storn, Rainer, and Kenneth Price. “Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces.” Journal of global optimization 11.4 (1997): 341-359.

evalPopulation(x, x_old, task)[source]

Evaluate element.

runTask(task)[source]

Run.

selectBetter(x, y)[source]
setParameters(NP=25, F=2, CR=0.2, CrossMutt=<function CrossRand1>, **ukwargs)[source]

Set the algorithm parameters.

Arguments:

NP {integer} – population size

F {decimal} – scaling factor

CR {decimal} – crossover rate

CrossMutt {function} – crossover and mutation strategy
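
CrossMutt hooks in the mutation-plus-crossover strategy; a sketch of the classic DE/rand/1/bin rule that the default CrossRand1 presumably implements:

    import numpy as np

    def cross_rand1(pop, i, F, CR, rnd=np.random):
        # DE/rand/1/bin: three distinct random donors form the mutant vector,
        # which is then binomially crossed with the current individual.
        NP, D = pop.shape
        r1, r2, r3 = rnd.choice([j for j in range(NP) if j != i], 3, replace=False)
        v = pop[r1] + F * (pop[r2] - pop[r3])
        mask = rnd.rand(D) < CR
        mask[rnd.randint(D)] = True  # guarantee at least one mutant component
        return np.where(mask, v, pop[i])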

class NiaPy.algorithms.basic.FlowerPollinationAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of Flower Pollination algorithm.

Algorithm: Flower Pollination algorithm

Date: 2018

Authors: Dusan Fister, Iztok Fister Jr. and Klemen Berkovič

License: MIT

Reference paper: Yang, Xin-She. “Flower pollination algorithm for global optimization.” International conference on unconventional computing and natural computation. Springer, Berlin, Heidelberg, 2012. Implementation is based on the following MATLAB code: https://www.mathworks.com/matlabcentral/fileexchange/45112-flower-pollination-algorithm?requestedDomain=true

levy()[source]
repair(x, task)[source]

Repair the solution so it stays within the problem bounds.

runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(NP=25, p=0.35, beta=1.5, **ukwargs)[source]

__init__(self, D, NP, nFES, p, benchmark).

Arguments:

NP {integer} – population size

p {decimal} – probability switch

beta {real} – Lévy flight parameter

class NiaPy.algorithms.basic.GreyWolfOptimizer(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of Grey wolf optimizer.

Algorithm: Grey wolf optimizer

Date: 2018

Author: Iztok Fister Jr. and Klemen Berkovič

License: MIT

Reference paper: Mirjalili, Seyedali, Seyed Mohammad Mirjalili, and Andrew Lewis. “Grey wolf optimizer.” Advances in engineering software 69 (2014): 46-61. Grey Wolf Optimizer (GWO) source code version 1.0 (MATLAB) from MathWorks

repair(x, task)[source]

Repair the solution so it stays within the problem bounds.

runTask(task)[source]

Run.

setParameters(NP=25, **ukwargs)[source]

Set the algorithm parameters.

Arguments:

NP {integer} – Number of individuals in population

class NiaPy.algorithms.basic.GeneticAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of Genetic algorithm.

Algorithm: Genetic algorithm

Date: 2018

Author: Uros Mlakar and Klemen Berkovič

License: MIT

evolve(pop, x_b, task)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(NP=25, Ts=5, Mr=0.25, Cr=0.25, Selection=<function TurnamentSelection>, Crossover=<function UniformCrossover>, Mutation=<function UniformMutation>, **ukwargs)[source]

Set the parameters of the algorithm.

Arguments:

NP {integer} – population size

Ts {integer} – tournament size used in tournament selection

Mr {decimal} – mutation rate

Cr {decimal} – crossover rate
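
A usage sketch, assuming constructor keyword arguments are forwarded to setParameters (see Algorithm.__init__) and the Sphere benchmark from NiaPy.benchmarks:

    from NiaPy.algorithms.basic import GeneticAlgorithm
    from NiaPy.benchmarks import Sphere

    # Ts, Mr and Cr are forwarded to setParameters above.
    algo = GeneticAlgorithm(D=10, NP=40, nFES=10000, Ts=4, Mr=0.2, Cr=0.8,
                            benchmark=Sphere())
    solution, fitness = algo.run()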

class NiaPy.algorithms.basic.ArtificialBeeColonyAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of Artificial Bee Colony algorithm.

Algorithm: Artificial Bee Colony algorithm

Date: 2018

Author: Uros Mlakar and Klemen Berkovič

License: MIT

Reference paper: Karaboga, D., and Bahriye B. “A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm.” Journal of global optimization 39.3 (2007): 459-471.

__init__(self, D, NP, nFES, benchmark).

See: Algorithm.__init__(self, **kwargs)

CalculateProbs()[source]

Calculate the probabilities.

checkForBest(Solution)[source]

Check best solution.

init(task)[source]

Initialize positions.

runTask(task)[source]

Run.

setParameters(NP=10, Limit=100, **ukwargs)[source]

Set the arguments of an algorithm.

Arguments:

NP {integer} – population size

Limit {integer} – Limit

class NiaPy.algorithms.basic.ParticleSwarmAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of Particle Swarm Optimization algorithm.

Algorithm: Particle Swarm Optimization algorithm

Date: 2018

Authors: Lucija Brezočnik, Grega Vrbančič, Iztok Fister Jr. and Klemen Berkovič

License: MIT

Reference paper: Kennedy, J. and Eberhart, R. “Particle Swarm Optimization”. Proceedings of IEEE International Conference on Neural Networks. IV. pp. 1942–1948, 1995.

init(task)[source]
repair(x, l, u)[source]
runTask(task)[source]

Move particles in search space.

setParameters(NP=25, C1=2.0, C2=2.0, w=0.7, vMin=-4, vMax=4, **ukwargs)[source]

Set the parameters for the algorithm.

Arguments:

NP {integer} – population size

C1 {decimal} – cognitive component

C2 {decimal} – social component

w {decimal} – inertia weight

vMin {decimal} – minimal velocity

vMax {decimal} – maximal velocity
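
These parameters map onto the canonical velocity update from the reference paper; a numpy sketch under that assumption:

    import numpy as np

    def update_velocity(v, x, pbest, gbest, w=0.7, C1=2.0, C2=2.0,
                        vMin=-4, vMax=4, rnd=np.random):
        # Canonical PSO rule: inertia plus cognitive and social attraction.
        r1, r2 = rnd.rand(len(x)), rnd.rand(len(x))
        v = w * v + C1 * r1 * (pbest - x) + C2 * r2 * (gbest - x)
        return np.clip(v, vMin, vMax)  # clamp each component to [vMin, vMax]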

class NiaPy.algorithms.basic.BareBonesFireworksAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of the Bare Bones Fireworks algorithm.

Algorithm: Bare Bones Fireworks Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: https://www.sciencedirect.com/science/article/pii/S1568494617306609

Reference paper: Junzhi Li, Ying Tan, The bare bones fireworks algorithm: A minimalist global optimizer, Applied Soft Computing, Volume 62, 2018, Pages 454-462, ISSN 1568-4946, https://doi.org/10.1016/j.asoc.2017.10.046.

runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(n=10, C_a=1.5, C_r=0.5, **ukwargs)[source]

Set the arguments of an algorithm.

Arguments:

n {integer} – number of sparks $\in [1, \infty)$

C_a {real} – amplification coefficient $\in [1, \infty)$

C_r {real} – reduction coefficient $\in (0, 1)$
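
A sketch of one iteration as described in the reference paper: n sparks are sampled uniformly in a hypercube of amplitude A around the best solution, and A is multiplied by C_a on improvement or by C_r otherwise:

    import numpy as np

    def bbfwa_step(f, xb, fb, A, n, C_a, C_r, lower, upper, rnd=np.random):
        # Sample n sparks uniformly in [xb - A, xb + A], clipped to the bounds.
        S = np.clip(rnd.uniform(xb - A, xb + A, (n, len(xb))), lower, upper)
        S_f = np.apply_along_axis(f, 1, S)
        i = np.argmin(S_f)
        if S_f[i] < fb:
            return S[i], S_f[i], A * C_a  # success: widen the search
        return xb, fb, A * C_r            # failure: shrink the search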

class NiaPy.algorithms.basic.CamelAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of Camel traveling behavior.

Algorithm: Camel algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: https://www.iasj.net/iasj?func=fulltext&aId=118375

Reference paper: Ali, Ramzy. (2016). Novel Optimization Algorithm Inspired by Camel Traveling Behavior. Iraq J. Electrical and Electronic Engineering. 12. 167-177.

lifeCycle(c, fit, fitn, mu, task)[source]
oasis(c, rn, fit, fitn, alpha)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(NP=50, omega=0.25, mu=0.5, alpha=0.5, S_init=10, E_init=10, T_min=-10, T_max=10, **ukwargs)[source]

Set the arguments of an algorithm.

Arguments:

NP {integer} – population size $\in [1, \infty)$

T_min {real} – minimum temperature, must satisfy $T_{min} < T_{max}$

T_max {real} – maximum temperature, must satisfy $T_{min} < T_{max}$

omega {real} – burden factor $\in [0, 1]$

mu {real} – dying rate $\in [0, 1]$

S_init {real} – initial supply $\in (0, \infty)$

E_init {real} – initial endurance $\in (0, \infty)$

walk(c, fit, task, omega, c_best)[source]
class NiaPy.algorithms.basic.MonkeyKingEvolutionV1(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of monkey king evolution algorithm version 1.

Algorithm: Monkey King Evolution version 1

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: https://www.sciencedirect.com/science/article/pii/S0950705116000198

Reference paper: Zhenyu Meng, Jeng-Shyang Pan, Monkey King Evolution: A new memetic evolutionary algorithm and its application in vehicle fuel consumption optimization, Knowledge-Based Systems, Volume 97, 2016, Pages 144-157, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2016.01.009.

moveMK(x, task)[source]
moveMokeyKingPartice(p, task)[source]
moveP(x, x_pb, x_b, task)[source]
movePartice(p, p_b, task)[source]
movePopulation(pop, p_b, task)[source]
repair(x, task)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(NP=40, F=0.7, R=0.3, C=3, FC=0.5, **ukwargs)[source]

Set the algorithm parameters.

Arguments:

NP {integer} – Size of population

F {real} – param

R {real} – param

C {real} – param

FC {real} – param

class NiaPy.algorithms.basic.MonkeyKingEvolutionV2(**kwargs)[source]

Bases: NiaPy.algorithms.basic.mke.MonkeyKingEvolutionV1

Implementation of monkey king evolution algorithm version 2.

Algorithm: Monkey King Evolution version 2

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: https://www.sciencedirect.com/science/article/pii/S0950705116000198

Reference paper: Zhenyu Meng, Jeng-Shyang Pan, Monkey King Evolution: A new memetic evolutionary algorithm and its application in vehicle fuel consumption optimization, Knowledge-Based Systems, Volume 97, 2016, Pages 144-157, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2016.01.009.

moveMK(x, dx, task)[source]
moveMokeyKingPartice(p, pop, task)[source]
movePopulation(pop, p_b, task)[source]
class NiaPy.algorithms.basic.MonkeyKingEvolutionV3(**kwargs)[source]

Bases: NiaPy.algorithms.basic.mke.MonkeyKingEvolutionV1

Implementation of monkey king evolution algorithm version 3.

Algorithm: Monkey King Evolution version 3

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: https://www.sciencedirect.com/science/article/pii/S0950705116000198

Reference paper: Zhenyu Meng, Jeng-Shyang Pan, Monkey King Evolution: A new memetic evolutionary algorithm and its application in vehicle fuel consumption optimization, Knowledge-Based Systems, Volume 97, 2016, Pages 144-157, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2016.01.009.

eval(X, x, x_f, task)[source]
neg(x)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

class NiaPy.algorithms.basic.EvolutionStrategy1p1(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of (1 + 1) evolution strategy algorithm. Uses just one individual.

Algorithm: (1 + 1) Evolution Strategy Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL:

Reference paper:

mutate(x, rho)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(mu=1, k=10, c_a=1.1, c_r=0.5, epsilon=1e-20, **ukwargs)[source]

Set the arguments of an algorithm.

Arguments:

mu {integer} – number of parents

k {integer} – number of iterations used to compute the success rate

c_a {real} – amplification coefficient for the mutation strength rho

c_r {real} – reduction coefficient for the mutation strength rho

updateRho(rho, k)[source]
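
The parameter names suggest Rechenberg's 1/5 success rule for adapting the mutation strength rho over windows of k trials; a sketch under that assumption:

    def update_rho(rho, success_rate, c_a=1.1, c_r=0.5):
        # Rechenberg's 1/5 rule: widen the mutation strength when more than a
        # fifth of the last k mutations improved the solution, narrow otherwise.
        if success_rate > 0.2:
            return rho * c_a
        if success_rate < 0.2:
            return rho * c_r
        return rho
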
class NiaPy.algorithms.basic.EvolutionStrategyMp1(**kwargs)[source]

Bases: NiaPy.algorithms.basic.es.EvolutionStrategy1p1

Implementation of the (mu + 1) evolution strategy algorithm. The algorithm creates mu mutants, but only one individual advances to the new generation.

Algorithm: ($\mu$ + 1) Evolution Strategy Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL:

Reference paper:

setParameters(**kwargs)[source]

Set the arguments of an algorithm.

Arguments:

mu {integer} – number of parents

k {integer} – number of iterations used to compute the success rate

c_a {real} – amplification coefficient for the mutation strength rho

c_r {real} – reduction coefficient for the mutation strength rho

class NiaPy.algorithms.basic.EvolutionStrategyMpL(**kwargs)[source]

Bases: NiaPy.algorithms.basic.es.EvolutionStrategy1p1

Implementation of the (mu + lambda) evolution strategy algorithm. Mutation creates lambda individuals, which compete with the mu individuals for survival, so only mu individuals advance to the new generation.

Algorithm: ($\mu$ + $\lambda$) Evolution Strategy Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL:

Reference paper:

changeCount(a, b)[source]
mutate(x, rho)[source]
mutateRepair(pop, task)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(**kwargs)[source]

Set the arguments of an algorithm.

Arguments:

mu {integer} – number of parents

k {integer} – number of iterations used to compute the success rate

c_a {real} – amplification coefficient for the mutation strength rho

c_r {real} – reduction coefficient for the mutation strength rho

updateRho(pop, k)[source]
class NiaPy.algorithms.basic.EvolutionStrategyML(**kwargs)[source]

Bases: NiaPy.algorithms.basic.es.EvolutionStrategyMpL

Implementation of the (mu, lambda) evolution strategy algorithm, which is well suited to dynamic environments. The mu individuals create lambda children; only the best mu children advance to the new generation, and the mu parents are discarded.

Algorithm: ($\mu$, $\lambda$) Evolution Strategy Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL:

Reference paper:

newPop(pop)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

class NiaPy.algorithms.basic.SineCosineAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of sine cosine algorithm.

Algorithm: Sine Cosine Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: https://www.sciencedirect.com/science/article/pii/S0950705115005043

Reference paper: Seyedali Mirjalili, SCA: A Sine Cosine Algorithm for solving optimization problems, Knowledge-Based Systems, Volume 96, 2016, Pages 120-133, ISSN 0950-7051, https://doi.org/10.1016/j.knosys.2015.12.022.

nextPos(x, x_b, r1, r2, r3, r4, task)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(NP=25, a=3, Rmin=0, Rmax=2, **ukwargs)[source]

Set the arguments of an algorithm.

Arguments:

NP {integer} – number of individuals in the population

a {real} – parameter controlling the $r_1$ value

Rmin {integer} – minimum value of $r_3$

Rmax {integer} – maximum value of $r_3$

class NiaPy.algorithms.basic.GlowwormSwarmOptimization(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of glowworm swarm optimization.

Algorithm: Glowworm Swarm Optimization Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: https://www.springer.com/gp/book/9783319515946

Reference paper: Kaipa, Krishnanand N., and Debasish Ghose. Glowworm swarm optimization: theory, algorithms, and applications. Vol. 698. Springer, 2017.

calcLuciferin(L, GS_f)[source]
getBest(GS, GS_f, xb, xb_f)[source]
getNeighbors(i, r, GS, L)[source]
moveSelect(pb, i)[source]
probabilityes(i, N, L)[source]
randMove(i)[source]
rangeUpdate(R, N, rs)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(n=25, l0=5, nt=5, rho=0.4, gamma=0.6, beta=0.08, s=0.03, **ukwargs)[source]

Set the arguments of an algorithm.

Arguments:

n {integer} – number of glowworms in population

l0 {real} – initial luciferin quantity for each glowworm

nt {real} – desired number of neighbors

rs {real} – maximum sensing range

rho {real} – luciferin decay constant

gamma {real} – luciferin enhancement constant

beta {real} – rate of change of the neighborhood range

s {real} – step size
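
A sketch of the two core updates from the reference book that these parameters drive (nt read as the desired neighbor count, s as the step size):

    def luciferin_update(L, fitness, rho, gamma):
        # Decay old luciferin, reward current fitness.
        return (1 - rho) * L + gamma * fitness

    def range_update(r, n_neighbors, rs, beta, nt):
        # Grow the sensing range when neighbors are scarce, shrink when crowded.
        return min(rs, max(0.0, r + beta * (nt - n_neighbors)))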

class NiaPy.algorithms.basic.GlowwormSwarmOptimizationV1(**kwargs)[source]

Bases: NiaPy.algorithms.basic.gso.GlowwormSwarmOptimization

Implementation of glowworm swarm optimization.

Algorithm: Glowworm Swarm Optimization Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: https://www.springer.com/gp/book/9783319515946

Reference paper: Kaipa, Krishnanand N., and Debasish Ghose. Glowworm swarm optimization: theory, algorithms, and applications. Vol. 698. Springer, 2017.

calcLuciferin(L, GS_f)[source]
rangeUpdate(R, N, rs)[source]
setParameters(**kwargs)[source]

Set the arguments of an algorithm.

Arguments:

n {integer} – number of glowworms in population

l0 {real} – initial luciferin quantity for each glowworm

nt {real} – desired number of neighbors

rs {real} – maximum sensing range

rho {real} – luciferin decay constant

gamma {real} – luciferin enhancement constant

beta {real} – rate of change of the neighborhood range

s {real} – step size

class NiaPy.algorithms.basic.GlowwormSwarmOptimizationV2(**kwargs)[source]

Bases: NiaPy.algorithms.basic.gso.GlowwormSwarmOptimization

Implementation of glowworm swarm optimization.

Algorithm: Glowworm Swarm Optimization Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: https://www.springer.com/gp/book/9783319515946

Reference paper: Kaipa, Krishnanand N., and Debasish Ghose. Glowworm swarm optimization: theory, algorithms, and applications. Vol. 698. Springer, 2017.

rangeUpdate(P, N, rs)[source]
setParameters(**kwargs)[source]

Set the arguments of an algorithm.

Arguments:

n {integer} – number of glowworms in population

l0 {real} – initial luciferin quantity for each glowworm

nt {real} – desired number of neighbors

rs {real} – maximum sensing range

rho {real} – luciferin decay constant

gamma {real} – luciferin enhancement constant

beta {real} – rate of change of the neighborhood range

s {real} – step size

class NiaPy.algorithms.basic.GlowwormSwarmOptimizationV3(**kwargs)[source]

Bases: NiaPy.algorithms.basic.gso.GlowwormSwarmOptimization

Implementation of glowworm swarm optimization.

Algorithm: Glowworm Swarm Optimization Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: https://www.springer.com/gp/book/9783319515946

Reference paper: Kaipa, Krishnanand N., and Debasish Ghose. Glowworm swarm optimization: theory, algorithms, and applications. Vol. 698. Springer, 2017.

rangeUpdate(R, N, rs)[source]
setParameters(**kwargs)[source]

Set the arguments of an algorithm.

Arguments:

n {integer} – number of glowworms in population

l0 {real} – initial luciferin quantity for each glowworm

nt {real} – desired number of neighbors

rs {real} – maximum sensing range

rho {real} – luciferin decay constant

gamma {real} – luciferin enhancement constant

beta {real} – rate of change of the neighborhood range

s {real} – step size

class NiaPy.algorithms.basic.HarmonySearch(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of harmony search algorithm.

Algorithm: Harmony Search Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: https://link.springer.com/chapter/10.1007/978-3-642-00185-7_1

Reference paper: Yang, Xin-She. “Harmony search as a metaheuristic algorithm.” Music-inspired harmony search algorithm. Springer, Berlin, Heidelberg, 2009. 1-14.

adjustment(x, task)[source]
bw(task)[source]
improvize(HM, task)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(HMS=30, r_accept=0.7, r_pa=0.35, b_range=1.42, **ukwargs)[source]

Set the arguments of the algorithm.

Arguments:

HMS {integer} – number of harmonies in the harmony memory

r_accept {real} – harmony memory consideration rate

r_pa {real} – pitch adjustment rate

b_range {real} – pitch adjustment bandwidth range
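
A sketch of the improvisation step these parameters drive: each component is drawn from the harmony memory with probability r_accept, pitch-adjusted with probability r_pa using the bandwidth bw, and otherwise sampled at random from the bounds:

    import numpy as np

    def improvise(HM, bw, r_accept, r_pa, lower, upper, rnd=np.random):
        HMS, D = HM.shape
        x = np.empty(D)
        for j in range(D):
            if rnd.rand() < r_accept:                 # memory consideration
                x[j] = HM[rnd.randint(HMS), j]
                if rnd.rand() < r_pa:                 # pitch adjustment
                    x[j] += bw * rnd.uniform(-1, 1)
            else:                                     # random selection
                x[j] = rnd.uniform(lower[j], upper[j])
        return np.clip(x, lower, upper)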

class NiaPy.algorithms.basic.HarmonySearchV1(**kwargs)[source]

Bases: NiaPy.algorithms.basic.hs.HarmonySearch

Implementation of harmony search algorithm.

Algorithm: Harmony Search Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: https://link.springer.com/chapter/10.1007/978-3-642-00185-7_1

Reference paper: Yang, Xin-She. “Harmony search as a metaheuristic algorithm.” Music-inspired harmony search algorithm. Springer, Berlin, Heidelberg, 2009. 1-14.

bw(task)[source]
setParameters(bw_min=1, bw_max=2, **kwargs)[source]

Set the parameters of the algorithm.

Arguments:

bw_min {real} – Minimal bandwidth

bw_max {real} – Maximal bandwidth

class NiaPy.algorithms.basic.KrillHerdV1(**kwargs)[source]

Bases: NiaPy.algorithms.basic.kh.KrillHerd

Implementation of krill herd algorithm.

Algorithm: Krill Herd Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: http://www.sciencedirect.com/science/article/pii/S1007570412002171

Reference paper: Amir Hossein Gandomi, Amir Hossein Alavi, Krill herd: A new bio-inspired optimization algorithm, Communications in Nonlinear Science and Numerical Simulation, Volume 17, Issue 12, 2012, Pages 4831-4845, ISSN 1007-5704, https://doi.org/10.1016/j.cnsns.2012.05.010.

crossover(x, xo, Cr)[source]
mutate(x, x_b, Mu)[source]
class NiaPy.algorithms.basic.KrillHerdV2(**kwargs)[source]

Bases: NiaPy.algorithms.basic.kh.KrillHerd

Implementation of krill herd algorithm.

Algorithm: Krill Herd Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: http://www.sciencedirect.com/science/article/pii/S1007570412002171

Reference paper: Amir Hossein Gandomi, Amir Hossein Alavi, Krill herd: A new bio-inspired optimization algorithm, Communications in Nonlinear Science and Numerical Simulation, Volume 17, Issue 12, 2012, Pages 4831-4845, ISSN 1007-5704, https://doi.org/10.1016/j.cnsns.2012.05.010.

mutate(x, x_b, Mu)[source]
class NiaPy.algorithms.basic.KrillHerdV3(**kwargs)[source]

Bases: NiaPy.algorithms.basic.kh.KrillHerd

Implementation of krill herd algorithm.

Algorithm: Krill Herd Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: http://www.sciencedirect.com/science/article/pii/S1007570412002171

Reference paper: Amir Hossein Gandomi, Amir Hossein Alavi, Krill herd: A new bio-inspired optimization algorithm, Communications in Nonlinear Science and Numerical Simulation, Volume 17, Issue 12, 2012, Pages 4831-4845, ISSN 1007-5704, https://doi.org/10.1016/j.cnsns.2012.05.010.

crossover(x, xo, Cr)[source]
class NiaPy.algorithms.basic.KrillHerdV4(**kwargs)[source]

Bases: NiaPy.algorithms.basic.kh.KrillHerd

Implementation of krill herd algorithm.

Algorithm: Krill Herd Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: http://www.sciencedirect.com/science/article/pii/S1007570412002171

Reference paper: Amir Hossein Gandomi, Amir Hossein Alavi, Krill herd: A new bio-inspired optimization algorithm, Communications in Nonlinear Science and Numerical Simulation, Volume 17, Issue 12, 2012, Pages 4831-4845, ISSN 1007-5704, https://doi.org/10.1016/j.cnsns.2012.05.010.

setParameters(NP=50, N_max=0.01, V_f=0.02, D_max=0.002, C_t=0.93, W_n=0.42, W_f=0.38, d_s=2.63, **ukwargs)[source]

Set the arguments of an algorithm.

Arguments:

NP {integer} – Number of krill herds in population

N_max {real} – maximum induced speed

V_f {real} – foraging speed

D_max {real} – maximum diffusion speed

C_t {real} – constant $\in [0, 2]$

W_n {real} or {array} – inertia weights of the motion induced by neighbors $\in [0, 1]$

W_f {real} or {array} – inertia weights of the motion induced by foraging $\in [0, 1]$

d_s {real} – maximum Euclidean distance for neighbors

nn {integer} – maximum number of neighbors for the neighbors effect

Cr {real} – Crossover rate

Mu {real} – Mutation rate

epsilon {real} – Small number to avoid division by zero

class NiaPy.algorithms.basic.KrillHerdV11(**kwargs)[source]

Bases: NiaPy.algorithms.basic.kh.KrillHerd

Implementation of krill herd algorithm.

Algorithm: Krill Herd Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL:

Reference paper:

Cr(KH_f, KHb_f, KHw_f)[source]
ElitistSelection(KH, KH_f, KHo, KHo_f)[source]
Foraging(KH, KH_f, KHo, KHo_f, W_f, F, KH_wf, KH_bf, x_food, x_food_f, task)[source]
Neighbors(i, KH, KH_f, iw, ib, N, W_n, task)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

class NiaPy.algorithms.basic.FireworksAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of fireworks algorithm.

Algorithm: Fireworks Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: https://www.springer.com/gp/book/9783662463529

Reference paper: Tan, Ying. “Firework Algorithm: A Novel Swarm Intelligence Optimization Method.” (2015).

ExplodeSpark(x, A, task)[source]
ExplosionAmplitude(x_f, xb_f, A, As)[source]
GaussianSpark(x, task)[source]
Mapping(x, task)[source]
NextGeneration(FW, FW_f, FWn, task)[source]
R(x, FW)[source]
SparsksNo(x_f, xw_f, Ss)[source]
initAmplitude(task)[source]
p(r, Rs)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(N=40, m=40, a=1, b=2, A=40, epsilon=1e-31, **ukwargs)[source]

Set the arguments of an algorithm.

Arguments:

N {integer} – number of fireworks

m {integer} – number of sparks

a {integer} – lower limitation of sparks

b {integer} – upper limitation of sparks

A {real} – maximum explosion amplitude

epsilon {real} – Small number to avoid division by zero
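
A sketch of the spark allocation from the reference paper: each firework receives a share of the m sparks proportional to its relative fitness, bounded below and above by a and b:

    import numpy as np

    def sparks_no(FW_f, m, a, b, epsilon=1e-31):
        # Better (lower) fitness earns more sparks; epsilon guards the division.
        worst = FW_f.max()
        S = m * (worst - FW_f + epsilon) / (np.sum(worst - FW_f) + epsilon)
        return np.clip(np.round(S), np.round(a * m), np.round(b * m)).astype(int)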

class NiaPy.algorithms.basic.EnhancedFireworksAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.basic.fwa.FireworksAlgorithm

Implementation of enhanced fireworks algorithm.

Algorithm: Enhanced Fireworks Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: https://ieeexplore.ieee.org/document/6557813/

Reference paper: S. Zheng, A. Janecek and Y. Tan, “Enhanced Fireworks Algorithm,” 2013 IEEE Congress on Evolutionary Computation, Cancun, 2013, pp. 2069-2077. doi: 10.1109/CEC.2013.6557813

ExplosionAmplitude(x_f, xb_f, A_min, Ah, As, task)[source]
GaussianSpark(x, xb, task)[source]
NextGeneration(FW, FW_f, FWn, task)[source]
initRanges(task)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(Ainit=20, Afinal=5, **ukwargs)[source]

Set the arguments of an algorithm.

Arguments:

Ainit {real} – initial value of the minimal explosion amplitude

Afinal {real} – final value of the minimal explosion amplitude

uAmin(Ainit, Afinal, task)[source]
class NiaPy.algorithms.basic.DynamicFireworksAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.basic.fwa.DynamicFireworksAlgorithmGauss

Implementation of dynamic fireworks algorithm.

Algorithm: Dynamic Fireworks Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6900485&isnumber=6900223

Reference paper: S. Zheng, A. Janecek, J. Li and Y. Tan, “Dynamic search in fireworks algorithm,” 2014 IEEE Congress on Evolutionary Computation (CEC), Beijing, 2014, pp. 3222-3229. doi: 10.1109/CEC.2014.6900485

runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

class NiaPy.algorithms.basic.DynamicFireworksAlgorithmGauss(**kwargs)[source]

Bases: NiaPy.algorithms.basic.fwa.EnhancedFireworksAlgorithm

Implementation of dynamic fireworks algorithm.

Algorithm: Dynamic Fireworks Algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=6900485&isnumber=6900223

Reference paper: S. Zheng, A. Janecek, J. Li and Y. Tan, “Dynamic search in fireworks algorithm,” 2014 IEEE Congress on Evolutionary Computation (CEC), Beijing, 2014, pp. 3222-3229. doi: 10.1109/CEC.2014.6900485

ExplosionAmplitude(x_f, xb_f, A, As)[source]
Mapping(x, task)[source]
NextGeneration(FW, FW_f, FWn, task)[source]

Elitism Random Selection.

initAmplitude(task)[source]
repair(x, d, epsilon)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(A_cf=20, C_a=1.2, C_r=0.9, epsilon=1e-08, **ukwargs)[source]

Set the arguments of an algorithm.

Arguments:

A_cf {real} – amplitude of the core firework

C_a {real} – amplification coefficient $\in (1, \infty)$

C_r {real} – reduction coefficient $\in (0, 1)$

epsilon {real} – small number to avoid division by zero

uCF(xnb, xcb, xcb_f, xb, xb_f, Acf, task)[source]
class NiaPy.algorithms.basic.GravitationalSearchAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of gravitational search algorithm.

Algorithm: Gravitational Search Algorithm

Date: 2018

Author: Klemen Berkovič

License: MIT

Reference URL: https://doi.org/10.1016/j.ins.2009.03.004

Reference paper: Esmat Rashedi, Hossein Nezamabadi-pour, Saeid Saryazdi, GSA: A Gravitational Search Algorithm, Information Sciences, Volume 179, Issue 13, 2009, Pages 2232-2248, ISSN 0020-0255

G(t)[source]
d(x, y, ln=2)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(NP=40, G_0=2.467, epsilon=1e-17, **ukwargs)[source]

Set the algorithm parameters.

Arguments:

NP {integer} – number of planets in population

G_0 {real} – starting gravitational constant

epsilon {real} – small number to avoid division by zero

NiaPy.algorithms.modified

Implementation of modified nature-inspired algorithms.

class NiaPy.algorithms.modified.HybridBatAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.basic.ba.BatAlgorithm

Implementation of Hybrid bat algorithm.

Algorithm: Hybrid bat algorithm

Date: 2018

Author: Grega Vrbančič and Klemen Berkovič

License: MIT

Reference paper: Fister Jr., Iztok and Fister, Dusan and Yang, Xin-She. “A Hybrid Bat Algorithm”. Elektrotehniski vestnik, 2013. 1-7.

runTask(task)[source]

Run algorithm with initialized parameters.

Return:

{array} – coordinates of the minimum of the objective function found

{decimal} – minimal value of the objective function found

setParameters(**kwargs)[source]

Set the parameters of the algorithm.

Arguments:

NP {integer} – population size

A {decimal} – loudness

r {decimal} – pulse rate

Qmin {decimal} – minimum frequency

Qmax {decimal} – maximum frequency

class NiaPy.algorithms.modified.SelfAdaptiveDifferentialEvolutionAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.basic.de.DifferentialEvolutionAlgorithm

Implementation of Self-adaptive differential evolution algorithm.

Algorithm: Self-adaptive differential evolution algorithm

Date: 2018

Author: Uros Mlakar and Klemen Berkovič

License: MIT

Reference paper: Brest, J., Greiner, S., Boskovic, B., Mernik, M., Zumer, V. Self-adapting control parameters in differential evolution: A comparative study on numerical benchmark problems. IEEE transactions on evolutionary computation, 10(6), 646-657, 2006.

AdaptiveGen(x)[source]
runTask(task)[source]

Run.

setParameters(F_l=0.0, F_u=2.0, Tao1=0.4, Tao2=0.6, **ukwargs)[source]

Set the parameters of an algorithm.

Arguments:

F_l {decimal} – scaling factor lower limit

F_u {decimal} – scaling factor upper limit

Tao1 {decimal} – change rate for F parameter update

Tao2 {decimal} – change rate for CR parameter update
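
A sketch of the self-adaptation rule from the reference paper that these parameters configure: before each mutation, F is resampled with probability Tao1 and CR with probability Tao2:

    import numpy as np

    def adapt_params(F, CR, F_l, F_u, Tao1, Tao2, rnd=np.random):
        # Each individual carries its own (F, CR); occasionally resample them.
        if rnd.rand() < Tao1:
            F = F_l + rnd.rand() * F_u
        if rnd.rand() < Tao2:
            CR = rnd.rand()
        return F, CR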

class NiaPy.algorithms.modified.DynNPSelfAdaptiveDifferentialEvolutionAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.modified.jde.SelfAdaptiveDifferentialEvolutionAlgorithm

Implementation of Dynamic population size self-adaptive differential evolution algorithm.

Algorithm: Dynamic population size self-adaptive differential evolution algorithm

Date: 2018

Author: Jan Popič

License: MIT

Reference URL: https://link.springer.com/article/10.1007/s10489-007-0091-x

Reference paper: Brest, Janez, and Mirjam Sepesy Maučec. “Population size reduction for the differential evolution algorithm.” Applied Intelligence 29.3 (2008): 228-247.

AdaptiveGen(x)[source]
runTask(task)[source]

Run.

setParameters(rp=0, pmax=4, **ukwargs)[source]

Set the parameters of an algorithm.

Arguments:

rp {integer} – small non-negative number added to the value of genp (when genp is not divisible)

pmax {integer} – number of population reductions

NiaPy.algorithms.other

Implementation of other optimization algorithms.

class NiaPy.algorithms.other.NelderMeadMethod(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of the Nelder–Mead method, also known as the downhill simplex or amoeba method.

Algorithm: Nelder Mead Method

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference URL: https://en.wikipedia.org/wiki/Nelder%E2%80%93Mead_method

init(task)[source]
method(X, X_f, task)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(alpha=1.0, gamma=2.0, rho=-0.5, sigma=0.5, **ukwargs)[source]

Set the arguments of an algorithm.

Arguments:

alpha {real} – Reflection coefficient parameter

gamma {real} – Expansion coefficient parameter

rho {real} – Contraction coefficient parameter

sigma {real} – Shrink coefficient parameter
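
A compact sketch of one Nelder–Mead iteration using these four coefficients (note the rho = -0.5 default folds the contraction direction into the same reflection formula):

    import numpy as np

    def nm_step(X, f, alpha=1.0, gamma=2.0, rho=-0.5, sigma=0.5):
        order = np.argsort([f(x) for x in X])
        X = X[order]                      # X[0] best, X[-1] worst
        c = X[:-1].mean(axis=0)           # centroid of all but the worst
        xr = c + alpha * (c - X[-1])      # reflection
        if f(X[0]) <= f(xr) < f(X[-2]):
            X[-1] = xr
        elif f(xr) < f(X[0]):             # expansion
            xe = c + gamma * (c - X[-1])
            X[-1] = xe if f(xe) < f(xr) else xr
        else:                             # contraction (rho < 0 pulls inward)
            xc = c + rho * (c - X[-1])
            if f(xc) < f(X[-1]):
                X[-1] = xc
            else:                         # shrink toward the best point
                X[1:] = X[0] + sigma * (X[1:] - X[0])
        return X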

class NiaPy.algorithms.other.HillClimbAlgorithm(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of iterative hill climbing algorithm.

Algorithm: Hill Climbing Algorithm

Date: 2018

Authors: Jan Popič

License: MIT

Reference URL:

Reference paper:

Initialize the iterative hill climbing algorithm.

See: Algorithm.__init__(self, **kwargs)

runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(delta=0.5, Neighborhood=<function Neighborhood>, **ukwargs)[source]

Set the algorithm parameters/arguments.

See: HillClimbAlgorithm.__setparams(self, delta=0.5, Neighborhood=Neighborhood, **ukwargs)

class NiaPy.algorithms.other.SimulatedAnnealing(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of Simulated Annealing Algorithm.

Algorithm: Simulated Annealing Algorithm

Date: 2018

Authors: Jan Popič

License: MIT

Reference URL:

Reference paper:

Initialize the Simulated Annealing algorithm.

See: Algorithm.__init__(self, **kwargs)

runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(delta=0.5, T=20, deltaT=0.8, coolingMethod=<function coolDelta>, **ukwargs)[source]

Set the algorithm parameters/arguments.

See: SimulatedAnnealing.__setparams(self, delta=0.5, T=20, deltaT=0.8, coolingMethod=coolDelta, **ukwargs)
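
A sketch of the Metropolis acceptance step that delta, T and deltaT presumably drive, with coolDelta assumed to lower the temperature by deltaT each iteration:

    import numpy as np

    def accept(f_new, f_cur, T, rnd=np.random):
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if f_new <= f_cur:
            return True
        return rnd.rand() < np.exp((f_cur - f_new) / T)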

class NiaPy.algorithms.other.MultipleTrajectorySearch(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of Multiple trajectory search.

Algorithm: Multiple trajectory search

Date: 2018

Authors: Klemen Berkovic

License: MIT

Reference URL: https://ieeexplore.ieee.org/document/4631210/

Reference paper: Lin-Yu Tseng and Chun Chen, “Multiple trajectory search for Large Scale Global Optimization,” 2008 IEEE Congress on Evolutionary Computation (IEEE World Congress on Computational Intelligence), Hong Kong, 2008, pp. 3052-3059. doi: 10.1109/CEC.2008.4631210

BONUS1 = 10

BONUS2 = 1

GradingRun(x, x_f, xb, xb_f, improve, SR, task)[source]
LsRun(k, x, x_f, xb, xb_f, improve, SR, g, task)[source]
getBest(X, X_f)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(NP=40, NoLsTests=5, NoLs=5, NoLsBest=5, NoEnabled=17, **ukwargs)[source]

Set the arguments of the algorithm.

Arguments:

NP, M {integer} – population size

NoLsTests {integer} – number of test runs on local search algorithms

NoLs {integer} – number of local search algorithm runs

NoLsBest {integer} – number of local search algorithm runs on the best solution

NoEnabled {integer} – number of best solutions used for testing

class NiaPy.algorithms.other.MultipleTrajectorySearchV1(**kwargs)[source]

Bases: NiaPy.algorithms.other.mts.MultipleTrajectorySearch

Implementation of Multiple trajectory search.

Algorithm: Multiple trajectory search

Date: 2018

Authors: Klemen Berkovic

License: MIT

Reference URL: https://ieeexplore.ieee.org/document/4983179/

Reference paper: Tseng, Lin-Yu, and Chun Chen. “Multiple trajectory search for unconstrained/constrained multi-objective optimization.” Evolutionary Computation, 2009. CEC‘09. IEEE Congress on. IEEE, 2009.

runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

NiaPy.algorithms.other.MTS_LS1(Xk, Xk_fit, Xb, Xb_fit, improve, SR, task, BONUS1=10, BONUS2=1, rnd=numpy.random)[source]
NiaPy.algorithms.other.MTS_LS2(Xk, Xk_fit, Xb, Xb_fit, improve, SR, task, BONUS1=10, BONUS2=1, rnd=numpy.random)[source]
NiaPy.algorithms.other.MTS_LS3(Xk, Xk_fit, Xb, Xb_fit, improve, SR, task, BONUS1=10, BONUS2=1, rnd=numpy.random)[source]
NiaPy.algorithms.other.MTS_LS1v1(Xk, Xk_fit, Xb, Xb_fit, improve, SR, task, BONUS1=10, BONUS2=1, rnd=numpy.random)[source]
NiaPy.algorithms.other.MTS_LS3v1(Xk, Xk_fit, Xb, Xb_fit, improve, SR, task, phi=3, BONUS1=10, BONUS2=1, rnd=numpy.random)[source]
class NiaPy.algorithms.other.AnarchicSocietyOptimization(**kwargs)[source]

Bases: NiaPy.algorithms.algorithm.Algorithm

Implementation of Anarchic Society Optimization algorithm.

Algorithm: Anarchic Society Optimization algorithm

Date: 2018

Authors: Klemen Berkovič

License: MIT

Reference paper: Ahmadi-Javid, Amir. “Anarchic Society Optimization: A human-inspired method.” Evolutionary Computation (CEC), 2011 IEEE Congress on. IEEE, 2011.

EI(x_f, xnb_f, gamma)[source]

Get external irregularity index.

FI(x_f, xpb_f, xb_f, alpha)[source]

Get fickleness index.

II(x_f, xpb_f, theta)[source]

Get internal irregularity index.

getBestNeighbors(i, X, X_f, rs)[source]
init(task)[source]
runTask(task)[source]

Start the optimization.

Arguments:

task {Task} – Task with bounds and objective function for optimization

Return:

solution {array} – point of best solution

fitness {real} – fitness value of best solution

setParameters(NP=43, alpha=[1, 0.83], gamma=[1.17, 0.56], theta=[0.932, 0.832], d=<function euclidean>, dn=<function euclidean>, nl=1, F=1.2, CR=0.25, Combination=<function Elitism>, **ukwargs)[source]

Set the parameters for the algorithm.

Arguments:

NP {integer} – population size

alpha {array} – factor for fickleness index function $\in [0, 1]$

gamma {array} – factor for external irregularity index function $\in [0, \infty)$

theta {array} – factor for internal irregularity index function $\in [0, \infty)$

d {function} – function that takes two function values and calculates the distance between them

dn {function} – function that takes two points in the function landscape and calculates the distance between them

nl {real} – normalized range for neighborhood search $\in (0, 1]$

F {real} – mutation parameter

CR {real} – crossover parameter $\in [0, 1]$

Combination {function} – function that combines movement strategies

uBestAndPBest(X, X_f, Xpb, Xpb_f)[source]