Positioning points in window coordinates in GeoGebra
This tutorial has also been compiled in LaTeX: Download it here
The original idea for this construction was taken from this entry by Michel Horvath. If you want to download a resource where this has been implemented, you can do so here.
The construction below places the points at random positions inside the visible window.
Introduction
When we want to position a point in GeoGebra, we do it using the program's positioning system. This system is based on the coordinate axes of the program, meaning the points are positioned relative to the x and y axes. However, sometimes we want to place points based on the visible window, for example, to position a point in a visible region.
In summary, to create a positioning system in pixels in GeoGebra, we need to relate two positioning systems: the program's own system (coordinate axes) and the screen system (pixels).
GeoGebra coordinate system
To know the width and height of the screen at any given moment, we need the corners of the active window (in GeoGebra coordinates). We will use the Corner function with numbers from 1 to 4 to get these coordinates.
Execute the following commands in GeoGebra using the input bar:

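Entered one by one in the input bar, the four corner definitions look like this:

```
corner1 = Corner(1)
corner2 = Corner(2)
corner3 = Corner(3)
corner4 = Corner(4)
```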
We have defined the variables corner1, corner2, corner3, and corner4, which represent the four corners of the GeoGebra window. Here’s how they are arranged:

Now that we have the corners, we can calculate the width and height of the window, not in pixels but in GeoGebra coordinates. We do this with the Distance command, which returns the distance between two points: Distance(corner1, corner2) gives the window's width and Distance(corner1, corner4) its height.
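The two lengths can be stored under names of our choosing (widthGGB and heightGGB are our labels; the rest of the construction does not depend on them):

```
widthGGB = Distance(corner1, corner2)
heightGGB = Distance(corner1, corner4)
```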
Lastly, we will store the x and y coordinates of the corners. We will use these to position the points correctly.
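The JavaScript code later reads the values valuexcorner1 and valueycorner4, so the stored coordinates should use exactly those names. Using GeoGebra's x() and y() functions:

```
valuexcorner1 = x(corner1)
valueycorner4 = y(corner4)
```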
The Corner(5) command returns a point whose x and y coordinates are the width and height of the visible window in pixels. Let's execute a few GeoGebra commands to store these values.
Now we can calculate the height and width of the window with these commands (in pixels):
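Assuming the Corner(5) point is stored as corner5 (that name is our choice), the pixel dimensions, under the names the JavaScript code expects, are:

```
corner5 = Corner(5)
winWidth = x(corner5)
winHeight = y(corner5)
```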
Conversion factors
Since we want to convert between the two positioning systems, our goal is to transform the coordinates of a point given in pixels into GeoGebra coordinates. Let's assume we want to place a point at the center of the screen. Since we know the window size in pixels, the point sits at the pixel coordinates:

(x1, y1) = (winWidth / 2, winHeight / 2)
However, this pixel system cannot be used directly to position the point; we need to convert it into GeoGebra coordinates. Look at the x pixel coordinate x1 of the point, and let x be the corresponding GeoGebra coordinate. Measuring both distances from the left edge of the window, the proportion between the two systems is:

x1 / winWidth = (x - valuexcorner1) / Distance(corner1, corner2)
Solving for x, we get:

x = valuexcorner1 + x1 * Distance(corner1, corner2) / winWidth
It's easy to see that the conversion factor from pixel coordinates to coordinate-axis coordinates is:

Distance(corner1, corner2) / winWidth
Similarly, for the vertical direction:

Distance(corner1, corner4) / winHeight
Since GeoGebra uses the same scale on both axes by default, both factors are the same, so we only need to calculate one of them, which we will call factorXY.
Now execute the following command in the input bar:

factorXY = Distance(corner1, corner2) / winWidth
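As a sanity check, here is the conversion worked through with hypothetical numbers: an 800×600 px window showing x from -6.25 to 6.25, chosen so that factorXY is exactly 1/64. All values below are made up for the example.

```javascript
// Hypothetical window: 800x600 pixels, showing x in [-6.25, 6.25],
// so factorXY = 12.5 / 800 = 1/64 exactly
var valuexcorner1 = -6.25;   // x of the lower-left corner
var valueycorner4 = 4.6875;  // y of the upper-left corner
var winWidth = 800, winHeight = 600;
var widthGGB = 12.5;                 // Distance(corner1, corner2)
var factorXY = widthGGB / winWidth;  // 0.015625 GeoGebra units per pixel

// The center of the screen in pixels...
var x1 = winWidth / 2;   // 400
var y1 = winHeight / 2;  // 300
// ...converted with the formulas above lands at the middle of the visible region
var x = valuexcorner1 + x1 * factorXY;  // -6.25 + 400 / 64 = 0
var y = valueycorner4 - y1 * factorXY;  //  4.6875 - 300 / 64 = 0
```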
Using JavaScript
Now you can use JavaScript in an element on the stage (for example, a button) to create new elements and position them relative to the window.
// Get a reference to the applet
var ap = ggbApplet;
// Window size in pixels
var winWidth = ap.getValue("winWidth");
var winHeight = ap.getValue("winHeight");
// Conversion factor calculated in GeoGebra (GeoGebra units per pixel)
var factorXY = ap.getValue("factorXY");
// The corners give the coordinates of the extremes used to place the point
var valuexcorner1 = ap.getValue("valuexcorner1");
var valueycorner4 = ap.getValue("valueycorner4");
// Given two values x, y in pixels, convert them to GeoGebra coordinates:
x = valuexcorner1 + x * factorXY;
y = valueycorner4 - y * factorXY;
Now that we understand the process, let's create a simple JavaScript function that makes it easier. The function takes the coordinates x and y of a point in pixels and converts them to GeoGebra coordinates.
function pointToWindow(x, y) {
    var point = [];
    // Get a reference to the applet
    var ap = ggbApplet;
    // Conversion factor calculated in GeoGebra (GeoGebra units per pixel)
    var factorXY = ap.getValue("factorXY");
    // The corners give the coordinates of the extremes of the visible window
    var valuexcorner1 = ap.getValue("valuexcorner1");
    var valueycorner4 = ap.getValue("valueycorner4");
    // Convert the pixel coordinates x, y to GeoGebra coordinates
    point[0] = valuexcorner1 + x * factorXY;
    point[1] = valueycorner4 - y * factorXY;
    return point;
}
If you want to create a simple application to generate random points that always appear in the visible area, you can do it with the following code. You can download the example here: GeoGebra Application
var ap = ggbApplet;
// Window size in pixels
var winWidth = ap.getValue("winWidth");
var winHeight = ap.getValue("winHeight");
// Coordinates of the point to place on the screen (in pixels)
var x = getRandomArbitrary(0, winWidth);
var y = getRandomArbitrary(0, winHeight);
// Convert to GeoGebra coordinates
var point = pointToWindow(x, y);
// alert(point[0] + " " + point[1]);
// Place the point in GeoGebra
var newPoint = ap.evalCommandGetLabels("(" + point[0] + "," + point[1] + ")");
// Hide the point label
ap.evalCommand("ShowLabel(" + newPoint + ",false)");
// Return a random number between min and max
function getRandomArbitrary(min, max) {
    return Math.random() * (max - min) + min;
}
function pointToWindow(x, y) {
    var point = [];
    // Get a reference to the applet
    var ap = ggbApplet;
    // Conversion factor calculated in GeoGebra (GeoGebra units per pixel)
    var factorXY = ap.getValue("factorXY");
    // The corners give the coordinates of the extremes of the visible window
    var valuexcorner1 = ap.getValue("valuexcorner1");
    var valueycorner4 = ap.getValue("valueycorner4");
    // Convert the pixel coordinates x, y to GeoGebra coordinates
    point[0] = valuexcorner1 + x * factorXY;
    point[1] = valueycorner4 - y * factorXY;
    return point;
}
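Since the conversion itself does not depend on the applet, it can also be written as a pure function and tried out outside GeoGebra. pixelToGGB and all the numbers below are hypothetical, not part of the original construction:

```javascript
// Pure version of pointToWindow: the window data is passed in
// as arguments instead of being read from ggbApplet.
function pixelToGGB(xPix, yPix, leftX, topY, factorXY) {
  // Same formulas as pointToWindow: move right from the left edge,
  // down from the top edge
  return [leftX + xPix * factorXY, topY - yPix * factorXY];
}

// Hypothetical 800x600 px window with factorXY = 1/64
var center = pixelToGGB(400, 300, -6.25, 4.6875, 0.015625);  // [0, 0]
var topLeft = pixelToGGB(0, 0, -6.25, 4.6875, 0.015625);     // [-6.25, 4.6875]
```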