Global optimization | Wikipedia audio article
This is an audio version of the Wikipedia article:
https://en.wikipedia.org/wiki/Global_optimization
00:02:52 1 General theory
00:12:16 2 Applications
00:13:42 3 Deterministic methods
00:13:56 3.1 Inner and outer approximation
00:14:21 3.2 Cutting-plane methods
00:15:02 3.3 Branch and bound methods
00:15:47 3.4 Interval methods
00:16:24 3.5 Methods based on real algebraic geometry
00:16:58 4 Stochastic methods
00:17:13 4.1 Direct Monte-Carlo sampling
00:18:33 4.2 Stochastic tunneling
00:19:06 4.3 Parallel tempering
00:20:35 5 Heuristics and metaheuristics
00:22:02 6 Response surface methodology-based approaches
00:22:28 7 See also
Listening is a more natural way of learning than reading. Written language only emerged around 3200 BC, but spoken language has existed for far longer.
Learning by listening is a great way to:
- increase imagination and understanding
- improve your listening skills
- improve your own spoken accent
- learn while on the move
- reduce eye strain
Now learn the vast amount of general knowledge available on Wikipedia through audio (audio article). You could even learn subconsciously by playing the audio while you sleep! If you plan to listen a lot, you could try bone-conduction headphones or a standard speaker instead of earphones.
Listen on Google Assistant through Extra Audio:
https://assistant.google.com/services/invoke/uid/0000001a130b3f91
Other Wikipedia audio articles at:
https://www.youtube.com/results?search_query=wikipedia+tts
Upload your own Wikipedia articles through:
https://github.com/nodef/wikipedia-tts
Speaking Rate: 0.9936861050535563
Voice name: en-US-Wavenet-B
"I cannot teach anybody anything, I can only make them think."
- Socrates
SUMMARY
=======
Global optimization is a branch of applied mathematics and numerical analysis that attempts to find the global minima or maxima of a function or a set of functions on a given set. It is usually described as a minimization problem because the maximization of the real-valued function
g(x) is obviously equivalent to the minimization of the function f(x) := (−1) · g(x).
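As a quick sketch of this reduction (the concave toy function g below is an assumed example, not taken from the article), SciPy's bounded scalar minimizer can maximize g by minimizing its negation:

from scipy.optimize import minimize_scalar

def g(x):
    # Assumed toy concave function with maximum g(2) = 5.
    return 5.0 - (x - 2.0) ** 2

# Maximize g by minimizing f(x) := (-1) * g(x).
res = minimize_scalar(lambda x: -g(x), bounds=(-10.0, 10.0), method="bounded")
print(res.x, -res.fun)   # ~2.0 and ~5.0: the maximizer and maximum of g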
Given a possibly nonlinear and non-convex continuous function f : Ω ⊂ ℝⁿ → ℝ with the global minimum f* and the set of all global minimizers X* in Ω, the standard minimization problem can be given as
min_{x ∈ Ω} f(x),
that is, finding f* and a global minimizer in X*, where Ω is a (not necessarily convex) compact set defined by the inequalities g_i(x) ≥ 0, i = 1, …, r.
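As a hedged sketch of this setup in code, the non-convex objective f (a two-dimensional Rastrigin function) and the single constraint g_1 below are illustrative assumptions, and differential evolution is only one of many stochastic global solvers that accept such inequality constraints:

import numpy as np
from scipy.optimize import differential_evolution, NonlinearConstraint

def f(x):
    # Assumed non-convex objective: the two-dimensional Rastrigin function.
    return 20 + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

# A single inequality g_1(x) >= 0 keeping x inside a disk of radius 4,
# so Omega is a compact (here also convex) feasible set.
g1 = NonlinearConstraint(lambda x: 16.0 - x[0]**2 - x[1]**2, 0.0, np.inf)

bounds = [(-5.12, 5.12)] * 2
result = differential_evolution(f, bounds, constraints=(g1,), seed=0)
print(result.x, result.fun)   # minimizer near the origin, f* ~ 0

A population-based solver is used here only because it explores the whole feasible region rather than a single basin; any of the deterministic or stochastic methods listed above could be substituted.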
Global optimization is distinguished from local optimization by its focus on finding the minimum or maximum over the given set, as opposed to finding local minima or maxima. Finding an arbitrary local minimum is relatively straightforward using classical local optimization methods. Finding the global minimum of a function is far more difficult: analytical methods are frequently not applicable, and the use of numerical solution strategies often leads to very hard challenges.
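To make the contrast concrete, here is a minimal multistart sketch (the double-well function and the restart heuristic are assumptions for illustration): a single local search started in the wrong basin stalls at a local minimum, while restarts from random points recover the global one.

import numpy as np
from scipy.optimize import minimize

def f(x):
    # Assumed double well: local minimum near x ~ 1.35, global near x ~ -1.47.
    return x[0]**4 - 4 * x[0]**2 + x[0]

local = minimize(f, x0=[2.0])   # classical local method, stalls at f ~ -2.6
rng = np.random.default_rng(0)
starts = rng.uniform(-3.0, 3.0, size=(50, 1))
best = min((minimize(f, x0=s) for s in starts), key=lambda r: r.fun)
print(local.x, local.fun)       # x ~ 1.35, the shallower well
print(best.x, best.fun)         # x ~ -1.47, f ~ -5.4, the global minimum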