A New Three Object Triangulation Algorithm for Mobile Robot Positioning
Vincent Pierlot and Marc Van Droogenbroeck
INTELSIG, Laboratory for Signal and Image Exploitation
Montefiore Institute, University of Liège, Belgium
Scientific paper, referenced [47], available in PDF.
Keywords: mobile robot positioning, triangulation, algorithms, benchmarking, software,
C source code,
documentation, ToTal algorithm
Executive summary: the ToTal triangulation algorithm.
Given three angles α_{1}, α_{2}, α_{3}, find the location of the observer {x_{R}, y_{R}}
Given the three beacon coordinates
{x_{1}, y_{1}},
{x_{2}, y_{2}},
{x_{3}, y_{3}}, and the three angles
α_{1}, α_{2}, α_{3}:

compute the modified beacon coordinates:
x’_{1} = x_{1} − x_{2}, y’_{1} = y_{1} − y_{2}, x’_{3} = x_{3} − x_{2}, y’_{3} = y_{3} − y_{2},

compute the three cot(.):
T_{12} = cot(α_{2} − α_{1}), T_{23} = cot(α_{3} − α_{2}), T_{31} = (1 − T_{12}T_{23})/(T_{12} + T_{23})

compute the modified circle center coordinates:
x’_{12} = x’_{1} + T_{12} y’_{1}, y’_{12} = y’_{1} − T_{12} x’_{1},
x’_{23} = x’_{3} − T_{23} y’_{3}, y’_{23} = y’_{3} + T_{23} x’_{3},
x’_{31} = (x’_{3} + x’_{1}) + T_{31} (y’_{3} − y’_{1}),
y’_{31} = (y’_{3} + y’_{1}) − T_{31} (x’_{3} − x’_{1}),

compute k’_{31}:
k’_{31} = x’_{1}x’_{3} + y’_{1}y’_{3} + T_{31}(x’_{1}y’_{3} − x’_{3}y’_{1}),

compute D (if D = 0, return with an error):
D = (x’_{12} − x’_{23})(y’_{23} − y’_{31}) − (y’_{12} − y’_{23})(x’_{23} − x’_{31}),

compute the robot position {x_{R}, y_{R}} and return:
x_{R} = x_{2} + (k’_{31}(y’_{12} − y’_{23}))/(D), y_{R} = y_{2} + (k’_{31}(x’_{23} − x’_{12}))/(D).
Abstract
Positioning is a fundamental issue in mobile robot applications. It can be achieved in many ways. Among them, triangulation based on angles measured with the help of beacons is a proven technique. Most of the many triangulation algorithms proposed so far have major limitations. For example, some of them need a particular beacon ordering, have blind spots, or only work within the triangle defined by the three beacons. More reliable methods exist; however, they have increased complexity or they require handling certain spatial arrangements separately.
In this paper, we present a simple and new three object triangulation algorithm, named ToTal, that natively works in the whole plane, and for any beacon ordering. We also provide a comprehensive comparison between many algorithms, and show that our algorithm is faster and simpler than comparable algorithms. In addition to its inherent efficiency, our algorithm provides a very useful and unique reliability measure that is assessable anywhere in the plane, which can be used to identify pathological cases, or as a validation gate in Kalman filters.
1 Introduction
Positioning is a fundamental issue in mobile robot applications. Indeed, in most cases, a mobile robot that moves in its environment has to position itself before it can execute its actions properly. Therefore, the robot has to be equipped with hardware and software capable of providing sensory feedback related to its environment
[36]. Positioning methods can be classified into two main groups
[23]: (1)
relative positioning (also called dead reckoning) and (2) absolute positioning (or reference-based). One of the most famous relative positioning techniques is odometry, which consists of counting the number of wheel revolutions to compute the offset relative to a known position. It is very accurate for small offsets but is not sufficient, because of the unbounded accumulation of errors over time (due to wheel slippage, imprecision in the wheel circumference, or the wheel base)
[23]. Furthermore, odometry needs an initial position and fails when the robot is “woken up” (after a forced reset for example) or is raised and dropped somewhere, since the reference position is unknown or modified. An absolute positioning system is thus required to recalibrate the robot position periodically.
Relative and absolute positioning are complementary to each other
[36, 25] and are typically merged together by using a Kalman filter
[30, 18]. In many cases, absolute positioning is ensured by beacon-based triangulation or trilateration.
Triangulation is the process of determining the robot pose (position and orientation) based on angle measurements, while
trilateration methods involve the determination of the robot position based on distance measurements. Because of the availability of angle measurement systems, triangulation has emerged as a widely used, robust, accurate, and flexible technique
[27]. Another advantage of triangulation over trilateration is that the robot can compute its orientation (or heading) in addition to its position, so that the complete
pose of the robot can be found. Triangulation is a broad term, which specifies neither whether the angles are measured from the robot or from the beacons, nor the number of angles used. In this paper, we are interested in self position determination, meaning that the angles are measured from the robot location. Figure
1↓ illustrates our triangulation setup.
Moreover, if only three beacons are used in self position determination, triangulation is also termed
Three Object Triangulation by Cohen and Koss
[5]. Here the general term
object refers to a 2D point, whose location is known.
Our algorithm, named
ToTal hereafter, has already been presented in a previous paper
[46]. In this paper, we supplement our previous work with an extensive review of the triangulation literature, detail the implementation of our algorithm, and compare it with seventeen other similar algorithms. Please note that the C source code implementation, developed for the error analysis and benchmarks, is made available to the scientific community.
The paper is organized as follows. Section
2↓ reviews some of the numerous triangulation algorithms found in the literature. Our new three object triangulation algorithm is described in Section
3↓. Section
4↓ presents simulation results and benchmarks. Then, we conclude the paper in Section
5↓.
2 Related Work
2.1 Triangulation algorithms
The principle of triangulation has existed for a long time, and many methods have been proposed so far. One of the first comprehensive reviews was carried out by Cohen and Koss
[5]. In their paper, they classify the triangulation algorithms into four groups: (1)
Geometric Triangulation, (2)
Geometric Circle Intersection, (3)
Iterative methods (Iterative Search, Newton-Raphson, etc.), and (4)
Multiple Beacons Triangulation.
The first group could be named Trigonometric Triangulation, because it makes intensive use of trigonometric functions. Algorithms of the second group determine the parameters (radius and center) of two (of the three) circles passing through the beacons and the robot; then they compute the intersection between these two circles. Methods of the first and second groups are typically used to solve the three object triangulation problem. The third group linearizes the trigonometric relations to converge to the robot position after some iterations, from a starting point (usually the last known robot position). Among the iterative methods, they also present Iterative Search, which searches the space of possible robot orientations by using a closeness measure of a solution. The fourth group addresses the more general problem of finding the robot pose from more than three angle measurements (usually corrupted by errors), which is an overdetermined problem.
Several authors have noticed that the second group (
Geometric Circle Intersection) is the most popular for solving the three object triangulation problem
[29, 9]. The oldest Geometric Circle Intersection algorithm was described by McGillem and Rappaport
[10, 9]. Font-Llagunes and Batlle
[29] present a very similar method, but they first change the reference frame so as to relocate beacon 2 at the origin and beacon 3 on the X axis. They compute the robot position in this reference frame and then apply a rotation and translation to return to the original reference frame. Zalama
et al. [16, 15] present a hardware system to measure angles to beacons and a method to compute the robot pose from three angle measurements. A similar hardware system and method based on
[16] and
[15] is described by Tsukiyama
[45]. Kortenkamp
[12] presents a method which turns out to be exactly the same as the one described by Cohen and Koss
[5]. All these methods compute the intersection of two of the three circles passing through the beacons and the robot. It appears that they are all variations or improvements of older methods of McGillem and Rappaport, or Cohen and Koss. The last one is described by Lukic
et al. [39, 37], but it is not general, as it only works for a subset of all possible beacon locations.
Some newer variations of Geometric/Trigonometric triangulation algorithms are also described in the literature. In
[27], Esteves
et al. extend the algorithm presented by Cohen and Koss
[5] to work for any beacon ordering and to work outside the triangle formed by the three beacons. In
[26, 34], they describe the improved version of their algorithm to handle the remaining special cases (when the robot lies on the line joining two beacons). They also analyze the position and orientation error sensitivity in
[34]. Whereas Easton and Cameron
[2] concentrate on an error model for the three object triangulation problem, they also briefly present an algorithm belonging to this family. Their simple method works in the whole plane and for any beacon ordering. The work of Hmam
[17] is based on Esteves
et al., as well as Cohen and Koss. He presents a method, valid for any beacon ordering, that divides the whole plane into seven regions and handles two specific configurations of the robot relative to the beacons. In
[8], Madsen and Andersen describe a vision-based positioning system. Such an algorithm belongs to the trigonometric triangulation family, as the vision system is used to measure angles between beacons.
It should be noted that the “three object triangulation problem” is also known as the “three point resection problem” in the surveying engineering research area. In this field, the beacons are often called stations, and the angles (or azimuths) are measured with a goniometer. As it has been a basic operation in the surveying field for decades, there are many procedures (more than 500) to solve this problem, numerically as well as graphically
[31]. Surprisingly, there is almost no link between these two fields, except the recent work of Font-Llagunes and Batlle
[31], and the older one of McGillem and Rappaport
[10, 9]. Therefore, it appears that some algorithms from the two fields are similar, but bear different names. One of the most famous and oldest procedures is called the Kaestner-Burkhardt method, which is also known as the Pothenot-Snellius method
[42]. This method is similar to the one described by McGillem and Rappaport
[10, 9], which is a trigonometric approach. Then, there is the Collins method
[42], which is a trigonometric solution, close to the one described by Esteves
et al. [34], or Cohen and Koss
[5]. In addition, there is the Cassini method
[42], similar to the method of Easton and Cameron
[2], both being a trigonometric approach. Finally, there is the Tienstra method
[42, 32], which is a completely different approach, since it makes use of the barycentric coordinates in order to express the robot position as a weighted sum of the beacons’ coordinates. This method has been known for a long time; Porta and Thomas have presented a new concise proof for this method recently
[32]. Although the three point resection problem has been known for a long time and has many solutions, newer works are still emerging. For example, Font-Llagunes and Batlle
[31] have published a new method, which uses the intersection of straight lines and the property of similar triangles. To our knowledge, the most recent work has been carried out by Ligas
[38]. Both Ligas’s method and
ToTal rely on the idea of using the radical axis of two circles. However, Ligas intersects one radical axis and one circle, whereas our algorithm intersects the three radical axes of the three pairs of circles. Likewise, Ligas also uses only two trigonometric functions (like our method
ToTal), and as a consequence, it is one of the most efficient methods (with
ToTal), as shown in Section
4.2↓.
Some of the Multiple Beacons Triangulation (multiangulation) algorithms are described hereafter. One of the first works in this field was presented by Avis and Imai
[11]. In their method, the robot measures
k angles from a subset of
n indistinguishable landmarks, and therefore they produce a bounded set of valid placements of the robot. Their algorithm runs in
O(kn^{2}) if the robot has a compass or in
O(kn^{3}) otherwise. The most famous algorithm was introduced by Betke and Gurvits
[36]. They use an efficient representation of landmark 2D locations by complex numbers to compute the robot pose. The landmarks are supposed to have an identifier known by the algorithm. The authors show that the complexity of their algorithm is proportional to the number of beacons. They also performed experiments with noisy angle measurements to validate their algorithm. Finally, they explain how the algorithm deals with outliers. Another interesting approach is proposed by Shimshoni
[22]. He presents an efficient SVD based multiangulation algorithm from noisy angle measurements, and explains why transformations have to be applied to the linear system in order to improve the accuracy of the solution. The solution is very close to the optimal solution computed with nonlinear optimization techniques, while being more than a hundred times faster. Briechle and Hanebeck
[35] present a new localization approach in the case of relative bearing measurements by reformulating the given localization problem as a nonlinear filtering problem.
Siadat and Vialle
[4] describe a multiangulation method based on the Newton-Raphson iterative method to converge to a solution minimizing an evaluation criterion. Lee
et al. [6] present another iterative method very similar to Newton-Raphson. Their algorithm was first designed to work with three beacons, but it can also be generalized to a higher number of beacons. The initial point of the convergence process is set to the center of the beacons, and good results are obtained after only four steps.
Sobreira
et al. [20] present a hybrid triangulation method working with two beacons only. They use a concept similar to the
running-fix method introduced by Bais in
[1], in which the robot has to move by a known distance to create a virtual beacon measurement and to compute the robot pose after it has stopped to take another angle measurement. In
[19], Sobreira
et al. perform an error analysis of their positioning system. In particular, they express the position uncertainty originated by errors on measured angles in terms of a surface. Sanchiz
et al. [33] describe another multiangulation method based on
Iterative Search and circular correlation. They first compute the robot orientation by maximizing the circular correlation between the expected beacons angles and the measured beacons angles. Then, a method similar to
Iterative Search is applied to compute the position. Hu and Gu
[18] present a multiangulation method based on a Kohonen neural network to compute the robot pose and to initialize an extended Kalman filter used for navigation.
2.2 Brief discussion
It is difficult to compare all the above-mentioned algorithms, because they operate in different conditions and have distinct behaviors. In practice, the choice is dictated by the application requirements and some compromises. For example, if the setup contains three beacons only or if the robot has limited onboard processing capabilities, methods of the first and second groups are the best candidates. Methods of the third and fourth groups are appropriate if the application must handle multiple beacons and if it can accommodate a higher computational cost. The main drawback of the third group is the convergence issue (existence or uniqueness of the solution)
[5]. The main drawback of the fourth group is the computational cost
[36, 35, 22].
The drawbacks of the first and second groups usually concern the following elements: (1) the beacon ordering needed to get the correct solution, (2) the consistency of the methods when the robot is located outside the triangle defined by the three beacons, (3) the strategy to follow in some particular geometrical cases (that induce mathematical underdeterminations when computing trigonometric functions with arguments like 0 or π, division by 0, etc.), and (4) the reliability measure of the computed position. Simple methods of the first and second groups usually fail to propose a proper answer to all of these concerns. For example, to work in the entire plane and for any beacon ordering (for instance [34]), they have to handle a set of special geometrical cases separately, which results in a lack of clarity. Finally, to our knowledge, none of these algorithms gives a realistic reliability measure of the computed position.
2.3 Other aspects of triangulation
For now, we have focused on the description of triangulation algorithms which are used to compute the position and orientation of the robot. Other aspects of triangulation have to be considered as well to achieve an optimal result on the robot pose in a practical situation. These are: (1) the sensitivity analysis of the triangulation algorithm, (2) the optimal placement of the landmarks, (3) the selection of some landmarks among the available ones to compute the robot pose, and (4) the knowledge of the true landmark locations in the world and the true location of the angular sensor on the robot.
Kelly
[3] uses the famous Geometric Dilution of Precision (GDOP) concept, used in GPS error analysis, and applies it to range based and angle based positioning techniques. He derives two useful formulas in the case of two beacons. Madsen
et al. [7] perform a sensitivity analysis of their triangulation algorithm. They use first order propagation of angle measurement errors through covariance matrix and Jacobian to derive the precision of location. Easton and Cameron
[2] present the same error analysis as that of Madsen
et al., but in addition to the angle uncertainty, they take into account the beacon location uncertainty. They also present some simulations for various beacon configurations as well as some metrics to evaluate their model’s performance.
Optimal landmark placement has been extensively studied. Sinriech and Shoval
[13, 43] define a nonlinear optimization model used to determine the position of the minimum number of beacons required by a shop floor to guarantee an accurate and reliable performance for automated guided vehicles (AGVs). Demaine
et al. [14] present a polynomial time algorithm to place reflector landmarks such that the robot can always localize itself from any position in the environment, which is represented by a polygonal map. Tekdas and Isler
[40, 41] address the problem of computing the minimum number and placement of sensors so that the localization uncertainty at every point in the workspace is less than a given threshold. They use the uncertainty model for angle based positioning derived by Kelly
[3].
Optimal landmark selection has been studied by Madsen
et al. [7, 8]. They propose an algorithm to select the best triplet among several landmarks seen by a camera, yielding the best position estimate. The algorithm is based on a “goodness” measure derived from an error analysis which depends on the landmarks and on the robot relative pose.
Having a good sensor that provides precise angle measurements as well as a good triangulation algorithm is not the only concern for accurate positioning. Indeed, the angle sensor may be subject to nonlinearities over the measuring range (a complete revolution). Moreover, the beacon locations, which are generally measured manually, are subject to inaccuracies that directly affect the positioning algorithm. In their paper, Loevsky and Shimshoni
[21] propose a method to calibrate the sensor and a method to correct the measured beacon locations. They show that their procedure is effective and mandatory to achieve good positioning performance.
In the remainder of this paper, we concentrate on three object triangulation methods. Our paper presents a new three object triangulation algorithm that works in the entire plane (except when the beacons and the robot are concyclic or collinear), and for any beacon ordering. Moreover, it minimizes the number of trigonometric computations, and provides a unique quantitative reliability measure of the computed position.
3 Description of a New Three Object Triangulation Algorithm
Our motivation for a new triangulation algorithm is fourfold: (1) we want it to be independent of the beacon ordering, (2) the algorithm must also be independent of the relative positions of the robot and the beacons, (3) the algorithm must be fast and simple to implement in a dedicated processor, and (4) the algorithm has to provide a criterion to qualify the reliability of the computed position.
Our algorithm, named ToTal, belongs to the family of Geometric Circle Intersection algorithms (that is, the second group). It first computes the parameters of the three circles passing through the robot and the three pairs of beacons. Then, it computes the intersection of these three circles, by using the parameters of all three circles (not only two of them, contrary to other methods).
Our algorithm relies on two assumptions: (1) the beacons are distinguishable (a measured angle can be associated with a given beacon), and (2) the angle measurements from the beacons are taken separately, relative to some reference angle
θ, usually the robot heading (see Figure
1↑). Note that the second hypothesis simply states that angles are given by a rotating angular sensor. Such sensors are common in mobile robot positioning using triangulation
[24, 23, 16, 6, 9, 48]. By convention, in the following, we consider that angles are measured counterclockwise (CCW), like angles on the trigonometric circle. Inverting the rotating direction to clockwise (CW) would only require minimal changes of our algorithm.
3.1 First part of the algorithm: the circle parameters
In a first step, we want to calculate the locus of the robot positions
R, that
see two fixed beacons,
B_{1} and
B_{2}, with a constant angle
γ, in the 2D plane. It is a wellknown result that this locus is an arc of the circle passing through
B_{1} and
B_{2}, whose radius depends on the distance between
B_{1} and
B_{2}, and
γ (Proposition 21 of Book III of
Euclid’s Elements
[44]). More precisely, this locus is composed of two arcs of circle, which are the reflection of each other through the line joining
B_{1} and
B_{2} (see the continuous lines of the left-hand side drawing of Figure
2↓).
A robot that measures an angle
γ between two beacons can stand on either of these two arcs. This ambiguity occurs if the beacons are not distinguishable or if the angular sensor is not capable of measuring angles larger than
π (like a vision system with a limited field of view, as used by Madsen
et al. [8]). To avoid this ambiguity, we impose that, as shown in the right-hand side of Figure
2↑, the measured angle between two beacons
B_{1} and
B_{2}, which is denoted
φ_{12}, is always computed as
φ_{12} = φ_{2} − φ_{1} (this choice is natural for a CCW rotating sensor). This is consistent with our measurement considerations and it removes the ambiguity about the locus; however, it requires that beacons are indexed and that the robot is capable of establishing the index of any beacon. As a result, the locus is a single circle passing through
R,
B_{1}, and
B_{2}. In addition, the line joining
B_{1} and
B_{2} divides the circle into two parts: one for
φ_{12} < π and the other for
φ_{12} > π. In the following, we compute the circle parameters.
The circle equation may be derived by using the complex representation of 2D points (Argand diagram), and by expressing angles as arguments of complex numbers. In particular, the angle of (B_{2} − R) is equal to that of (B_{1} − R) plus φ_{12}. Equivalently,
arg((B_{2} − R)/(B_{1} − R)) = φ_{12}.
Then, if we substitute R, B_{1}, B_{2}, respectively by (x + iy), (x_{1} + iy_{1}), (x_{2} + iy_{2}), we have that
arg(((x_{2} − x) + i(y_{2} − y))/((x_{1} − x) + i(y_{1} − y))) = φ_{12},
where i = √( − 1). After lengthy simplifications, we find the locus
(x − x_{12})^{2} + (y − y_{12})^{2} = R^{2}_{12},
which is a circle whose center {x_{12}, y_{12}} is located at
x_{12} = ((x_{1} + x_{2}) + cot(φ_{12})(y_{1} − y_{2}))/2,
y_{12} = ((y_{1} + y_{2}) − cot(φ_{12})(x_{1} − x_{2}))/2,
and whose squared radius equals
R^{2}_{12} = ((x_{1} − x_{2})^{2} + (y_{1} − y_{2})^{2})/(4 sin^{2}(φ_{12})).
The three last equations may also be found in
[29]. The replacement of φ_{12} by π + φ_{12} in the above equations yields the same circle parameters (see the right-hand side of Figure 2↑), which is consistent with our measurement considerations. For an angular sensor turning in the CW direction, these equations are identical, except that one has to change the sign of cot(.) in the two circle center equations.
Hereafter, we use the following notations:

B_{i} is the beacon i, whose coordinates are {x_{i}, y_{i}},

R is the robot position, whose coordinates are {x_{R}, y_{R}},

φ_{i} is the angle for beacon B_{i},

φ_{ij} = φ_{j} − φ_{i} is the bearing angle between beacons B_{i} and B_{j},

T_{ij} = cot(φ_{ij}),

C_{ij} is the circle passing through B_{i}, B_{j}, and R,

c_{ij} is the center of C_{ij}, whose coordinates are
x_{ij} = ((x_{i} + x_{j}) + T_{ij}(y_{i} − y_{j}))/2, y_{ij} = ((y_{i} + y_{j}) − T_{ij}(x_{i} − x_{j}))/2,

R_{ij} is the radius of C_{ij}, derived from
R^{2}_{ij} = ((x_{i} − x_{j})^{2} + (y_{i} − y_{j})^{2})/(4 sin^{2}(φ_{ij})).
All the previous quantities are valid for i ≠ j; otherwise the circle does not exist. In addition, we have to consider the case φ_{ij} = kπ, k ∈ ℤ. In that case, sin(φ_{ij}) equals zero, cot(φ_{ij}) is infinite, and the circle degenerates to the B_{i}B_{j} line (infinite radius and center coordinates). In a practical situation, it means that the robot stands on the B_{i}B_{j} line, and measures an angle φ_{ij} = π when being between the two beacons, or φ_{ij} = 0 when being outside of the B_{i}B_{j} segment. These special cases are discussed later.
3.2 Second part of the algorithm: circles intersection
From the previous section, each bearing angle
φ_{ij} between beacons
B_{i} and
B_{j} constrains the robot to be on a circle
C_{ij} that passes through
B_{i},
B_{j}, and
R (Figure
3↓). The parameters of the circles are given by the center and radius equations of the previous section. Common methods use two of the three circles to compute the intersections (when they exist), one of which is the robot position, the other being the common beacon of the two circles. This requires solving a quadratic system and choosing the correct solution for the robot position
[29]. Moreover, the choice of the two circles is arbitrary and usually fixed, whereas this choice should depend on the measured angles and on the relative configuration of the robot and the beacons in order to achieve a better numerical behavior.
Hereafter, we propose a novel method to compute this intersection, by using all of the three circles, and by reducing the problem to a linear problem. To understand this elegant method, we first introduce the notion of
power center (or
radical center) of three circles. The
power center of three circles is the unique point of equal
power with respect to these circles
[28]. The
power of a point p relative to a circle C is defined as
P_{C, p} = (x − x_{c})^{2} + (y − y_{c})^{2} − R^{2},
where {x, y} are the coordinates of p, {x_{c}, y_{c}} are the circle center coordinates, and R is the circle radius. The power of a point is zero on the circle, negative inside the circle, and positive outside the circle. It defines a sort of relative distance of a point from a given circle. The
power line (or
radical axis) of two circles is the locus of points having the same power with respect to both circles
[28]; in other terms, it is also the locus of points at which tangents drawn to both circles have the same length. The power line is perpendicular to the line joining the circle centers and passes through the circle intersections, when they exist. Monge demonstrated that when considering three circles, the three power lines defined by the three pairs of circles are concurring in the power center
[28]. Figure
4↓ shows the power center of three circles for various configurations. The power center is always defined, except when at least two of the three circle centers are equal, or when the circle centers are collinear (parallel power lines).
The third case of Figure
4↑ (right-hand drawing) is particular, as it perfectly matches our triangulation problem (Figure
3↑). Indeed, the power center of three concurring circles corresponds to their unique intersection. In our case, we are sure that the circles are concurring since we have
φ_{12} + φ_{23} + φ_{31} = 2π
by construction (only two of the three bearing angles are independent), even in the presence of noisy angle measurements
φ_{1},
φ_{2}, and
φ_{3}. It has the advantage that this intersection may be computed by intersecting the power lines, which is a linear problem. The power line of two circles is obtained by equating the powers of a point relative to each circle [given by equation 11↑]. In our problem, the power line of C_{12} and C_{23} is given by
x(x_{23} − x_{12}) + y(y_{23} − y_{12}) = k_{23} − k_{12},
where we introduce a new quantity k_{ij} which only depends on the C_{ij} parameters:
k_{ij} = (x^{2}_{ij} + y^{2}_{ij} − R^{2}_{ij})/2.
In our triangulation problem, we have to intersect the three power lines, that is, to solve this linear system:
x(x_{23} − x_{12}) + y(y_{23} − y_{12}) = k_{23} − k_{12},
x(x_{31} − x_{23}) + y(y_{31} − y_{23}) = k_{31} − k_{23},
x(x_{12} − x_{31}) + y(y_{12} − y_{31}) = k_{12} − k_{31}.
As can be seen, any of these equations may be obtained by adding the two others, which is a way to prove that the three power lines concur in a unique point: the power center. The coordinates of the power center, that is, the robot position, are given by
x_{R} = (k_{12}(y_{23} − y_{31}) + k_{23}(y_{31} − y_{12}) + k_{31}(y_{12} − y_{23}))/(D_{△}),
y_{R} = (k_{12}(x_{31} − x_{23}) + k_{23}(x_{12} − x_{31}) + k_{31}(x_{23} − x_{12}))/(D_{△}).
The denominator D_{△}, which is common to x_{R} and y_{R}, is equal to
D_{△} = (x_{12} − x_{23})(y_{23} − y_{31}) − (y_{12} − y_{23})(x_{23} − x_{31}),
which is twice the signed area defined by the circle centers. This result shows that the power center exists if the circle centers are not collinear, that is, if
D_{△} ≠ 0. The special case (
D_{△} = 0) is discussed later.
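The linear solution of this section can be sketched as a small C routine (the function name is ours): given the three circle centers {x_{ij}, y_{ij}} and the parameters k_{ij} = (x^{2}_{ij} + y^{2}_{ij} − R^{2}_{ij})/2, it returns the power center, or an error when the centers are collinear:

```c
/* Hypothetical sketch: power center of three circles C12, C23, C31, given
 * their centers {xij, yij} and parameters kij = (xij^2 + yij^2 - Rij^2)/2.
 * Returns 0 on success, -1 when the centers are collinear (D == 0). */
int power_center(double x12, double y12, double k12,
                 double x23, double y23, double k23,
                 double x31, double y31, double k31,
                 double *xR, double *yR)
{
    /* D is twice the signed area defined by the three circle centers */
    double D = (x12 - x23) * (y23 - y31) - (y12 - y23) * (x23 - x31);
    if (D == 0.0)
        return -1;
    *xR = (k12 * (y23 - y31) + k23 * (y31 - y12) + k31 * (y12 - y23)) / D;
    *yR = (k12 * (x31 - x23) + k23 * (x12 - x31) + k31 * (x23 - x12)) / D;
    return 0;
}
```

For three circles chosen to pass through a common point, the routine returns that point.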
3.3 First (naive) version of the algorithm
A first, but naive, version of our algorithm consists in applying the previous equations to get the robot position. This method is correct; however, it is possible to further simplify the equations. First, note that the squared radii R^{2}_{ij} only appear in the parameters k_{ij}. If we replace the expression of R^{2}_{ij} [equation 10↑] in the expression of k_{ij} [equation 16↑], we find, after many simplifications, that
k_{ij} = (x_{i}x_{j} + y_{i}y_{j} + T_{ij}(x_{j}y_{i} − x_{i}y_{j}))/2,
which is much simpler (no squared terms anymore). In addition, the ^{1}⁄_{2} factor involved in the circle center coordinates as well as in the parameters k_{ij} cancels in the robot position coordinates. This factor can thus be omitted. From now on, we use the modified circle center coordinates {x’_{ij}, y’_{ij}}
x’_{ij} = (x_{i} + x_{j}) + T_{ij}(y_{i} − y_{j}), y’_{ij} = (y_{i} + y_{j}) − T_{ij}(x_{i} − x_{j}),
and the modified parameters k’_{ij}
k’_{ij} = x_{i}x_{j} + y_{i}y_{j} + T_{ij}(x_{j}y_{i} − x_{i}y_{j}).
3.4 Final version of the algorithm
The most important simplification consists in translating the world coordinate frame to one of the beacons, that is, solving the problem relative to one beacon and then adding the beacon coordinates to the computed robot position (like Font-Llagunes
[29] without the rotation of the frame). In the following, we arbitrarily choose
B_{2} as the origin (
B’_{2} = {0, 0}). The other beacon coordinates become:
B’_{1} = {x_{1} − x_{2}, y_{1} − y_{2}} = {x’_{1}, y’_{1}} and
B’_{3} = {x_{3} − x_{2}, y_{3} − y_{2}} = {x’_{3}, y’_{3}}. Since
x’_{2} = 0 and
y’_{2} = 0, we have
k’_{12} = 0,
k’_{23} = 0. In addition, we can compute the value of one cot(.) from the two others, because the three angles are linked [equation 12↑]:
T_{31} = (1 − T_{12}T_{23})/(T_{12} + T_{23}).
The final algorithm is given in Algorithm
2↓.
Given the three beacon coordinates
{x_{1}, y_{1}},
{x_{2}, y_{2}},
{x_{3}, y_{3}}, and the three angles
φ_{1},
φ_{2},
φ_{3}:

compute the modified beacon coordinates:
x’_{1} = x_{1} − x_{2}, y’_{1} = y_{1} − y_{2},
x’_{3} = x_{3} − x_{2}, y’_{3} = y_{3} − y_{2},

compute the three cot(.):
T_{12} = cot(φ_{2} − φ_{1}), T_{23} = cot(φ_{3} − φ_{2}),
T_{31} = (1 − T_{12}T_{23})/(T_{12} + T_{23}),

compute the modified circle center coordinates:
x’_{12} = x’_{1} + T_{12} y’_{1}, y’_{12} = y’_{1} − T_{12} x’_{1},
x’_{23} = x’_{3} − T_{23} y’_{3}, y’_{23} = y’_{3} + T_{23} x’_{3},
x’_{31} = (x’_{3} + x’_{1}) + T_{31} (y’_{3} − y’_{1}),
y’_{31} = (y’_{3} + y’_{1}) − T_{31} (x’_{3} − x’_{1}),

compute k’_{31}:
k’_{31} = x’_{1}x’_{3} + y’_{1}y’_{3} + T_{31}(x’_{1}y’_{3} − x’_{3}y’_{1}),

compute D (if D = 0, return with an error):
D = (x’_{12} − x’_{23})(y’_{23} − y’_{31}) − (y’_{12} − y’_{23})(x’_{23} − x’_{31}),

compute the robot position {x_{R}, y_{R}} and return:
x_{R} = x_{2} + (k’_{31}(y’_{12} − y’_{23}))/(D), y_{R} = y_{2} + (k’_{31}(x’_{23} − x’_{12}))/(D).
Algorithm 2 Final version of the ToTal algorithm.
3.5 Discussion
The ToTal algorithm is very simple: computations are limited to basic arithmetic operations and only two cot(.) computations. Furthermore, the number of conditional statements is reduced, which increases its readability and eases its implementation. The remaining conditions to take care of are infinite cot(.) values and the division by D when D equals zero. If a bearing angle φ_{ij} between two beacons is equal to 0 or π, that is, if the robot stands on the B_{i}B_{j} line, then cot(φ_{ij}) is infinite. The corresponding circle degenerates to the B_{i}B_{j} line (infinite radius and center coordinates). The robot is then located at the intersection of the remaining power line and the B_{i}B_{j} line; it can be shown that the mathematical limit lim_{T_{ij} → ±∞}{x_{R}, y_{R}} exists and corresponds to this situation.
As with other algorithms, ours has to deal with these special cases, but handling them is simple. In practice, we have to avoid
Inf or
NaN values in the floating point computations. We propose two ways to manage this situation. The first way consists in limiting the
cot(.) value to a minimum or maximum value, corresponding to a small angle that is far below the measurement precision. For instance, we limit the value of the
cot(.) to
±10^{8}, which corresponds to an angle of about
±10^{ − 8} rad; this is indeed far below the existing angular sensor precisions. With this approximation of the mathematical limit, the algorithm remains unchanged. The second way consists in adapting the algorithm when one bearing angle is equal to
0 or
π. This special case is detailed in Algorithm
3↓, in which the indexes
{i, j, k} have to be replaced by
{1, 2, 3},
{3, 1, 2}, or
{2, 3, 1} if
φ_{31} = 0,
φ_{23} = 0, or
φ_{12} = 0 respectively.
Given the three beacon coordinates
{x_{i}, y_{i}},
{x_{j}, y_{j}},
{x_{k}, y_{k}}, and the three angles
φ_{i},
φ_{j},
φ_{k}:

compute the modified beacon coordinates:
x’_{i} = x_{i} − x_{j}, y’_{i} = y_{i} − y_{j},
x’_{k} = x_{k} − x_{j}, y’_{k} = y_{k} − y_{j},

compute T_{ij} = cot(φ_{j} − φ_{i}),

compute the modified circle center coordinates:
x’_{ij} = x’_{i} + T_{ij} y’_{i}, y’_{ij} = y’_{i} − T_{ij} x’_{i},
x’_{jk} = x’_{k} + T_{ij} y’_{k}, y’_{jk} = y’_{k} − T_{ij} x’_{k},
x’_{ki} = (y’_{k} − y’_{i}), y’_{ki} = (x’_{i} − x’_{k}),

compute k’_{ki} = (x’_{i}y’_{k} − x’_{k}y’_{i}),

compute D (if D = 0, return with an error):
D = (x’_{jk} − x’_{ij})(y’_{ki}) + (y’_{ij} − y’_{jk})(x’_{ki}),

compute the robot position {x_{R}, y_{R}} and return:
x_{R} = x_{j} + (k’_{ki}(y’_{ij} − y’_{jk}))/(D), y_{R} = y_{j} + (k’_{ki}(x’_{jk} − x’_{ij}))/(D).
Algorithm 3 Special case φ_{ki} = 0 ∨ φ_{ki} = π.
The denominator
D is equal to
0 when the circle centers are collinear or coincide. For non-collinear beacons, this situation occurs when the beacons and the robot are concyclic: they all stand on the same circle, called the
critic circumference in
[29]. In that case, the three circles are equal, as are their centers, which causes
D = 0 (the area defined by the three circle centers is equal to zero). For collinear beacons, this situation is encountered when the beacons and the robot all stand on this line. In these cases, it is impossible to compute the robot position. This is a
restriction common to all three object triangulation methods, regardless of the algorithm used
[27, 29, 9].
The value of
D, computed in the final algorithm, is the signed area delimited by the three circle centers, multiplied by eight.
D decreases to
0 when the robot approaches the critic circumference (almost collinear circle centers and almost parallel power lines). Therefore, it is quite natural to use
D as a reliability measure of the computed position. In the next section, we show that
^{1}⁄_{D} is a good approximation of the position error. In practice, this measure can be used as a validation gate after the triangulation algorithm, or when a Kalman filter is used. Finally, it should be noted that the robot orientation
θ_{R} may be determined by using any beacon
B_{i} and its corresponding angle
φ_{i}, once the robot position is known:
θ_{R} = atan2(y_{i} − y_{R}, x_{i} − x_{R}) − φ_{i},
where
atan2(y, x) denotes the C-like two-argument function, defined as the principal value of the argument of the complex number
(x + iy).
4 Simulations
4.1 Error Analysis
The problem of triangulation given three angle measurements is an exact computation of the robot pose, even if these angles are affected by noise. Therefore, the sensitivity of triangulation with respect to the input angles is unique and does not depend on the way the problem is solved, nor on the algorithm. This contrasts with multiangulation, which is an overdetermined problem, even with perfect angle measurements. Therefore, we do not elaborate on the error analysis for triangulation, as it has been studied in many papers; the conclusions found in
[2, 34, 29, 3, 8, 43] also hold for our algorithm. However, in order to validate our algorithm and to discuss the main characteristics of triangulation sensitivity, we have performed some simulations. The simulation setup comprises a square shaped area (
4 × 4 m^{2}), with three beacons forming two distinct configurations. The first one is a regular triangle (
B_{1} = {0 m, 1 m},
B_{2} = { − 0.866 m, − 0.5 m}, and
B_{3} = {0.866 m, − 0.5 m}), and the second one is a line (
B_{1} = {0 m, 0 m},
B_{2} = { − 0.866 m, 0 m}, and
B_{3} = {0.866 m, 0 m}). The distance step is
2 cm in each direction. For each point in this grid, we compute the exact angles
φ_{i} seen by the robot (the robot orientation is arbitrarily set to
0 °). Then we add Gaussian noise to these angles, with zero mean, and with two different standard deviations (
σ = 0.01 °,
σ = 0.1 °). The noisy angles are then used as inputs of our algorithm to compute the estimated position. The position error
Δd_{R} is the Euclidean distance between the exact position {x_{R}, y_{R}} and the estimated position {x̂_{R}, ŷ_{R}}:
Δd_{R} = √((x_{R} − x̂_{R})^{2} + (y_{R} − ŷ_{R})^{2}).
The orientation error
Δθ_{R} is the difference between the exact and estimated orientations:
Δθ_{R} = θ_{R} − θ̂_{R}.
The experiment is repeated
1000 times for each position to compute the standard deviation of the position error
√(var{Δd_{R}}) and of the orientation error
√(var{Δθ_{R}}). The standard deviations of the position and orientation errors are drawn in Figure
5↓. The beacon locations are represented by black and white dot patterns. The first and second columns provide the result for the first configuration, for
σ = 0.01 °, and
σ = 0.1 °, respectively. The third and fourth columns provide the result for the second configuration, for
σ = 0.01 °, and
σ = 0.1 °, respectively. The first, second, and third rows show the standard deviation of the position error, the standard deviation of the orientation error, and the mean error measure
^{1}⁄_{D}, respectively. Note that the graphic scales are not linear. We have equalized the image histograms in order to enhance their visual representation, and to point out the similarities between the position and orientation error, and our new error measure.
Our simulation results are consistent with common three object triangulation algorithms. In particular, in the first configuration, we can easily spot the critic circumference where errors are large, the error being minimum at the center of this circumference. In the second configuration, this critic circumference degenerates into the line passing through the beacons. In addition, one can see that, outside the critic circumference, the error increases with the distance to the beacons. It is also interesting to note that
^{1}⁄_{D} has a similar shape to the position and orientation errors (except in the particular case of collinear beacons). It can be proven [starting from equations
18↑ and
18↑], by a detailed sensitivity analysis of the robot position error with respect to angles, that
Δd_{R} ≈ (f(.) Δφ)/(D),
where
Δφ is the angle error (assumed to be the same for the three angles), and
f(.) is some function of all the other parameters (see the appendix for details). This confirms our claim that
^{1}⁄_{D} can be used as an approximation of the position error. Furthermore, one can observe from the graphic scales that the position and orientation errors evolve almost linearly with the angle errors, when these are small. Note that there is a small discrepancy in the symmetry of the simulated orientation error with respect to the expected behavior. This is because we used
B_{1} to compute the orientation (see equation
26↑); in addition, the histogram equalization emphasizes this small discrepancy.
4.2 Benchmarks

Ref       Algorithm            +    ×    ⁄   √(x)  trigo  time (s) ^{†}
[46, 47]  ToTal ^{1}           30   17   2    0      2     0.163
[38]      Ligas ^{1}           29   22   2    0      2     0.171
[29]      Font-Llagunes ^{1}   23   17   2    0      5     0.223
[42]      Cassini ^{2}         19    8   3    0      4     0.249
[5]       Cohen ^{1}           37   15   3    2      4     0.272
[2]       Easton ^{2}          22   24   1    0      5     0.298
[9]       McGillem ^{1}        37   18   5    2      8     0.340
[17]      Hmam ^{2}            29   11   3    3      9     0.428
[5]       Cohen ^{2}           26   11   3    2     11     0.437
[34]      Esteves ^{2}         43   14   2    2     11     0.471
[42]      Collins ^{2}         34   10   2    2     11     0.485
[9]       McGillem ^{2}        29    9   3    2     11     0.501
[42]      Kaestner ^{2}        28   10   3    2     11     0.504
[45]      Tsukiyama ^{1}       52   22   3    5     14     0.596
[16]      Casanova ^{1}        52   21   4    5     14     0.609
[32]      Tienstra ^{2}        33   18   8    3      9     0.640
[31]      Font-Llagunes ^{1}   62   25   6    1      8     0.648
[8]       Madsen ^{2}          38   24   5    3     15     0.707

^{†} For 10^{6} executions on an Intel(R) Core(TM) i7 920 @ 2.67GHz.
^{1} Geometric circle intersection. ^{2} Trigonometric solution.
Table 1 Comparison of various triangulation algorithms to our ToTal algorithm.
We have also compared the execution time of our algorithm to 17 other three object triangulation algorithms similar to ours (i.e. which work in the whole plane and for any beacon ordering). These algorithms have been introduced in Section
2↑, and have been implemented following the authors’ guidelines. Each algorithm has been run
10^{6} times at random locations of the same square shaped area as that used for the error analysis. The last column of Table
1↑ provides the running times on an
Intel(R) Core(TM) i7 920 @ 2.67GHz (6GB RAM, Ubuntu 11.04, GCC 4.5.2). We used the C
clock_gettime function to measure the execution times, in order to yield reliable results under time sharing. It appears that our algorithm is the fastest of all (about
30 % faster than the previously best known algorithm of Font-Llagunes
[29], and
5 % faster than the recent algorithm of Ligas
[38]). In addition to the computation times, we have also reported the number of basic arithmetic operations, square roots, and trigonometric functions used by each algorithm. This may help in choosing an algorithm for a particular hardware architecture, which may behave differently for basic arithmetic operations than for complex functions such as square roots or trigonometric functions. One can see that our algorithm uses the minimum number of trigonometric functions, which is clearly related to the execution times on a classical computer architecture (see Table
1↑). A fast algorithm is an advantage for error simulations, beacon placement, and beacon position optimization algorithms. Note that the algorithm of Ligas also uses the minimum number of trigonometric functions (two
cot(.) computations), like
ToTal, which explains why both algorithms are basically similar in terms of efficiency. However, the algorithm of Ligas does not provide a reliability measure, contrary to our algorithm
ToTal.
5 Conclusions
Most of the many triangulation algorithms proposed so far have major limitations. For example, some of them need a particular beacon ordering, have blind spots, or only work within the triangle defined by the three beacons. More reliable methods exist, but they have a higher complexity or they require certain spatial arrangements to be handled separately.
This paper presents a new three object triangulation algorithm based on the elegant notion of the power center of three circles. Our new triangulation algorithm, called ToTal, natively works in the whole plane (except when the beacons and the robot are concyclic or collinear), and for any beacon ordering. Furthermore, it only uses basic arithmetic operations and two cot(.) computations. Comprehensive benchmarks show that our algorithm is faster than comparable algorithms, and simpler in terms of the number of operations. In this paper, we have compared the number of basic arithmetic operations, square roots, and trigonometric functions used by 17 known triangulation algorithms.
In addition, we have proposed a unique reliability measure of the triangulation result in the whole plane, and established by simulations that ^{1}⁄_{D} is a natural and adequate criterion to estimate the positioning error. To our knowledge, none of the algorithms of the same family provides such a measure. This error measure can be used to identify the pathological cases (critic circumference), or as a validation gate in Kalman filters based on triangulation.
For all these reasons, ToTal is a fast, flexible, and reliable three object triangulation algorithm. Such an algorithm is an excellent choice for many triangulation issues related to performance or optimization, such as error simulations, beacon placement, or beacon position optimization algorithms. It can also be used to understand the sensitivity of triangulation with respect to the input angles. A fast and inexpensive algorithm is also an asset for initializing a positioning algorithm that internally relies on a Kalman filter.
Appendix
In this section, we detail the sensitivity analysis of the computed position. We start by computing the derivative of
x_{R} with respect to the first angle
φ_{1}:
∂x_{R}/∂φ_{1} = g_{1}(.)/D,
where
g_{1}(.) is some function of all the other parameters. Similar results hold for the derivatives of
x_{R} with respect to the second and third angles,
φ_{2} and
φ_{3} respectively:
∂x_{R}/∂φ_{2} = g_{2}(.)/D, ∂x_{R}/∂φ_{3} = g_{3}(.)/D,
where
g_{2}(.) and
g_{3}(.) are some functions of all the other parameters. The total differential of
x_{R} with respect to
φ_{1},
φ_{2}, and
φ_{3} is given by
Δx_{R} = (∂x_{R}/∂φ_{1}) Δφ_{1} + (∂x_{R}/∂φ_{2}) Δφ_{2} + (∂x_{R}/∂φ_{3}) Δφ_{3} = ((g_{1}(.) + g_{2}(.) + g_{3}(.)) Δφ)/(D),
where we assumed that the three infinitesimal increments are equal:
Δφ = Δφ_{1} = Δφ_{2} = Δφ_{3}. A similar result holds for the total differential of
y_{R}:
Δy_{R} = (h(.) Δφ)/(D),
where
h(.) is some function of all the other parameters. Finally, we can compute
Δd_{R} as follows:
Δd_{R} = √(Δx_{R}^{2} + Δy_{R}^{2}) = (f(.) Δφ)/(D),
where
f(.) is some function of all the other parameters.
References
[1] A. Bais, R. Sablatnig, J. Gu. Single landmark based self-localization of mobile robots. Third Canadian Conference on Computer and Robot Vision (CRV), 2006.
[2] A. Easton, S. Cameron. A Gaussian Error Model for Triangulation-Based Pose Estimation Using Noisy Landmarks. IEEE Conference on Robotics, Automation and Mechatronics:1-6, 2006. URL http://dx.doi.org/10.1109/RAMECH.2006.252663.
[3] A. Kelly. Precision Dilution in Triangulation Based Mobile Robot Position Estimation. Intelligent Autonomous Systems, 8:1046-1053, 2003.
[4] A. Siadat, S. Vialle. Robot Localization, Using P-Similar Landmarks, Optimized Triangulation and Parallel Programming. IEEE International Symposium on Signal Processing and Information Technology, 2002.
[5] C. Cohen, F. Koss. A Comprehensive Study of Three Object Triangulation. Mobile Robots VII, 1831:95-106, 1992. URL http://dx.doi.org/10.1117/12.143782.
[6] C. Lee, Y. Chang, G. Park, J. Ryu, S.G. Jeong, S. Park, J.W. Park, H.C. Lee, K.S. Hong, M.H. Lee. Indoor positioning system based on incident angles of infrared emitters. Conference of the IEEE Industrial Electronics Society (IECON), 3:2218-2222, 2004.
[7] C.B. Madsen, C.S. Andersen, J.S. Sorensen. A robustness analysis of triangulation-based robot self-positioning. International Symposium on Intelligent Robotic Systems:195-204, 1997.
[8] C.B. Madsen, C.S. Andersen. Optimal landmark selection for triangulation of robot position. Robotics and Autonomous Systems, 23(4):277-292, 1998. URL http://dx.doi.org/10.1016/S0921-8890(98)00014-1.
[9] C.D. McGillem, T.S. Rappaport. A Beacon Navigation Method for Autonomous Vehicles. IEEE Transactions on Vehicular Technology, 38(3):132-139, 1989. URL http://dx.doi.org/10.1109/25.45466.
[10] C.D. McGillem, T.S. Rappaport. Infrared location system for navigation of autonomous vehicles. IEEE International Conference on Robotics and Automation (ICRA), 2:1236-1238, 1988. URL http://dx.doi.org/10.1109/ROBOT.1988.12230.
[11] D. Avis, H. Imai. Locating a robot with angle measurements. Journal of Symbolic Computation, 10(3-4):311-326, 1990.
[12] D. Kortenkamp. Perception for mobile robot navigation: A survey of the state of the art. 1994.
[13] D. Sinriech, S. Shoval. Landmark configuration for absolute positioning of autonomous vehicles. IIE Transactions, 32(7):613-624, 2000.
[14] E. Demaine, A. Lopez-Ortiz, J. Munro. Robot Localization without Depth Perception. Proceedings of the Scandinavian Workshop on Algorithm Theory (SWAT), 2368:177-194, 2002.
[15] E. Zalama, S. Dominguez, J. Gomez, J.R. Peran. Microcontroller based system for 2D localisation. Mechatronics, 15(9):1109-1126, 2005. URL http://dx.doi.org/10.1016/j.mechatronics.2005.05.001.
[16] E.Z. Casanova, S.D. Quijada, J.G. García-Bermejo, J.R.P. González. A New Beacon-based System for the Localization of Moving Objects. IEEE International Conference on Mechatronics and Machine Vision in Practice, 2002. URL http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.7.1615.
[17] H. Hmam. Mobile Platform Self-Localization. Information, Decision and Control:242-247, 2007. URL http://dx.doi.org/10.1109/IDC.2007.374557.
[18] H. Hu, D. Gu. Landmark-based Navigation of Industrial Mobile Robots. International Journal of Industry Robot, 27(6):458-467, 2000. URL http://dx.doi.org/10.1108/01439910010378879.
[19] H. Sobreira, A.P. Moreira, J. Esteves. Characterization of Position and Orientation Measurement Uncertainties in a Low-Cost Mobile Platform. Portuguese Conference on Automatic Control (CONTROLO):635-640, 2010.
[20] H. Sobreira, A.P. Moreira, J. Esteves. Low cost self-localization system with two beacons. International Conference on Mobile Robots and Competitions (ROBOTICA):73-77, 2010.
[21] I. Loevsky, I. Shimshoni. Reliable and efficient landmark-based localization for mobile robots. Robotics and Autonomous Systems, 58(5):520-528, 2010. URL http://dx.doi.org/10.1016/j.robot.2010.01.006.
[22] I. Shimshoni. On Mobile Robot Localization From Landmark Bearings. IEEE Transactions on Robotics and Automation, 18(6):971-976, 2002. URL http://dx.doi.org/10.1109/TRA.2002.805663.
[23] J. Borenstein, H. Everett, L. Feng, D. Wehe. Mobile Robot Positioning: Sensors and Techniques. Journal of Robotic Systems, 14(4):231-249, 1997.
[24] J. Borenstein, H. Everett, L. Feng. Where am I? Systems and Methods for Mobile Robot Positioning. 1996.
[25] J. Borenstein, L. Feng. UMBmark: A Benchmark Test for Measuring Odometry Errors in Mobile Robots. Society of Photo-Instrumentation Engineers, 1001:113-124, 1995. URL http://dx.doi.org/10.1117/12.228968.
[26] J. Esteves, A. Carvalho, C. Couto. An improved version of the Generalized geometric triangulation algorithm. European-Latin-American Workshop on Engineering Systems, 2006.
[27] J. Esteves, A. Carvalho, C. Couto. Generalized Geometric Triangulation Algorithm for Mobile Robot Absolute Self-Localization. IEEE International Symposium on Industrial Electronics (ISIE), 1:346-351, 2003.
[28] J.D. Eiden. Géométrie analytique classique. Calvage & Mounet, 2009.
[29] J.M. Font-Llagunes, J.A. Batlle. Consistent triangulation for mobile robot localization using discontinuous angular measurements. Robotics and Autonomous Systems, 57(9):931-942, 2009. URL http://dx.doi.org/10.1016/j.robot.2009.06.001.
[30] J.M. Font-Llagunes, J.A. Batlle. Mobile Robot Localization. Revisiting the Triangulation Methods. International IFAC Symposium on Robot Control, 8:340-345, 2006.
[31] J.M. Font-Llagunes, J.A. Batlle. New method that solves the three-point resection problem using straight lines intersection. Journal of Surveying Engineering, 135(2):39-45, 2009. URL http://dx.doi.org/10.1061/(ASCE)0733-9453(2009)135:2(39).
[32] J.M. Porta, F. Thomas. Simple Solution to the Three Point Resection Problem. Journal of Surveying Engineering, 135(4):170-172, 2009. URL http://dx.doi.org/10.1061/(ASCE)0733-9453(2009)135:4(170).
[33] J.M. Sanchiz, J. Badenas, F. Pla. Control system and laser-based sensor design of an autonomous vehicle for industrial environments. Society of Photo-Instrumentation Engineers, 5422(1):608-615, 2004. URL http://dx.doi.org/10.1117/12.565046.
[34] J.S. Esteves, A. Carvalho, C. Couto. Position and Orientation Errors in Mobile Robot Absolute Self-Localization Using an Improved Version of the Generalized Geometric Triangulation Algorithm. IEEE International Conference on Industrial Technology (ICIT):830-835, 2006. URL http://dx.doi.org/10.1109/ICIT.2006.372345.
[35] K. Briechle, U. Hanebeck. Localization of a Mobile Robot Using Relative Bearing Measurements. IEEE Transactions on Robotics and Automation, 20(1):36-44, 2004.
[36] M. Betke, L. Gurvits. Mobile Robot Localization Using Landmarks. IEEE Transactions on Robotics and Automation, 13(2):251-263, 1997. URL http://dx.doi.org/10.1109/70.563647.
[37] M. Brkic, M. Lukic, J. Bajic, B. Dakic, M. Vukadinovic. Hardware realization of autonomous robot localization system. International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO):146-150, 2012.
[38] M. Ligas. Simple Solution to the Three Point Resection Problem. Journal of Surveying Engineering, 139(3):120-125, 2013. URL http://dx.doi.org/10.1061/(ASCE)SU.1943-5428.0000104.
[39] M. Lukic, B. Miodrag, J. Bajic. An Autonomous Robot Localization System Based on Coded Infrared Beacons. Research and Education in Robotics - EUROBOT 2011, 161:202-209, 2011. URL http://dx.doi.org/10.1007/978-3-642-21975-7_18.
[40] O. Tekdas, V. Isler. Sensor Placement Algorithms for Triangulation Based Localization. IEEE International Conference on Robotics and Automation (ICRA):4448-4453, 2007. URL http://dx.doi.org/10.1109/ROBOT.2007.364164.
[41] O. Tekdas, V. Isler. Sensor Placement for Triangulation-Based Localization. IEEE Transactions on Automation Science and Engineering, 7(3):681-685, 2010. URL http://dx.doi.org/10.1109/TASE.2009.2037135.
[42] R. Burtch. Three point resection problem. Surveying Engineering Department, Ferris State University, 2005.
[43] S. Shoval, D. Sinriech. Analysis of landmark configuration for absolute positioning of autonomous vehicles. Journal of Manufacturing Systems, 20(1):44-54, 2001.
[44] T. Heath. Euclid: the thirteen books of The Elements. Dover Publications, 1956.
[45] T. Tsukiyama. Mobile Robot Localization from Landmark Bearings. World Congress on Fundamental and Applied Metrology:2109-2112, 2009. URL http://www.imeko.org/publications/wc2009/IMEKOWC2009TC17084.pdf.
[46] V. Pierlot, M. Van Droogenbroeck, M. Urbin-Choffray. A new three object triangulation algorithm based on the power center of three circles. Research and Education in Robotics (EUROBOT), 161:248-262, 2011. URL http://hdl.handle.net/2268/89435.
[47] V. Pierlot, M. Van Droogenbroeck. A New Three Object Triangulation Algorithm for Mobile Robot Positioning. IEEE Transactions on Robotics, 30(3):566-577, 2014. URL http://dx.doi.org/10.1109/TRO.2013.2294061.
[48] V. Pierlot, M. Van Droogenbroeck. A simple and low cost angle measurement system for mobile robot positioning. Workshop on Circuits, Systems and Signal Processing (ProRISC):251-254, 2009. URL http://orbi.ulg.ac.be/handle/2268/30550.