Common input for the two platforms
Introduction
User interaction with Android / Desktop:
    • Text input
    • Keyboard input
    • Mouse / Touch input
Conclusion
Introduction
In this part we will look at how the user interacts with our app / game. Since we target both Android and Desktop, we have to keep in mind that the two platforms share some inputs, which we can handle with the same code, while other inputs have to be handled separately. We will see how to handle the inputs available on both platforms and how Libgdx smooths out the differences between them. You'll see, it's awesome!
User interaction with Android / Desktop
Users of your app / game will interact with it through various I/O devices. We want to handle these interactions in a single source code that works on two different platforms, so Libgdx manages the differences between the two platforms through one of its modules.
Among these differences:
On Desktop we use the mouse, while on Android we use our fingers to interact with the screen. As a result, on Android there is no detection of mouse movement: only dragging (moving while keeping the screen pressed) can be detected. On the other hand, Android supports multi-touch (several fingers used at the same time), which is not possible on Desktop because the mouse has only one cursor.
Regarding the keyboard, Desktop obviously has one, but many Android devices have no physical keyboard. To deal with this, Libgdx can display a single on-screen (software) keyboard that behaves the same way on all devices.
Note also that Android has a set of keys that do not exist in the Desktop environment, such as the Back button, the Home button, the Search button, etc. Of course there are many other differences, and Libgdx handles them in a unified code base valid for both platforms.
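For instance, the on-screen keyboard and the Android Back button can both be driven from the shared code. Here is a minimal sketch (the class name AndroidExtras is ours) using two Libgdx calls that only matter on Android and are simply ignored on Desktop:

import com.badlogic.gdx.Gdx;

// A minimal sketch (class name is ours): these calls are relevant on Android
// and have no effect on Desktop.
public class AndroidExtras {

    // Show or hide the software keyboard so the user can type without a physical keyboard.
    public void showKeyboard(boolean visible) {
        Gdx.input.setOnscreenKeyboardVisible(visible);
    }

    // Ask Libgdx to deliver the Back key to our input code instead of closing the app.
    public void catchBackKey() {
        Gdx.input.setCatchBackKey(true);
    }
}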
Text input
We'll start with the simplest case: entering text in an application on Android or Desktop. The user types the text in a dialog box.
On Desktop: a Swing-style dialog box is opened.
On Android: a standard Android dialog box is opened.
This is possible as long as a class implements the TextInputListener interface, which has two methods:
• input(): called when the user has entered a string and pressed OK.
• canceled(): called when the user has closed the dialog box (on Desktop) or pressed the Back button (on Android).
* Here's what it looks like in source code:
public class AuditorTexte implements TextInputListener {

    @Override
    public void input(String text) {
        // called with the text the user entered
    }

    @Override
    public void canceled() {
        // called when the dialog was dismissed
    }
}
* Now, to display the dialog, you must first instantiate the AuditorTexte class:
AuditorTexte listener = new AuditorTexte();
Then place this call wherever you want to ask the user for text:
Gdx.input.getTextInput(listener, "title", "Initial text");
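As a quick illustration, here is a minimal sketch of a concrete listener (the class name NameListener is ours) that simply logs what the user typed:

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.Input.TextInputListener;

// A minimal concrete listener: it only logs the result of the dialog.
public class NameListener implements TextInputListener {

    @Override
    public void input(String text) {
        Gdx.app.log("TextInput", "The user typed: " + text);
    }

    @Override
    public void canceled() {
        Gdx.app.log("TextInput", "The dialog was dismissed");
    }
}

You would then open the dialog with Gdx.input.getTextInput(new NameListener(), "Your name", "player"); exactly like with AuditorTexte above.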
Inputs with Keyboard
As we already said, Libgdx tries to reduce the differences between the two platforms. For the keyboard it offers two mechanisms: polling and event handling.
The Polling
A keyboard event triggered by the user (by pressing or releasing a key) generates a code that identifies the key concerned. Android and Desktop do not use the same codes for the keys, which is why Libgdx defines its own key codes. The polling mechanism consists of continuously checking whether a key is currently pressed. Polling is a quick and easy way to handle user input.
* Here's how to query a key of the keyboard:
boolean isAPressed = Gdx.input.isKeyPressed(Keys.A);
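To make the idea concrete, here is a small polling sketch (the class and field names are ours) whose update() you would call once per frame from render(); the position keeps moving as long as an arrow key is held down:

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.Input.Keys;

// A minimal polling sketch: call update() once per frame from render().
public class ArrowKeyMover {

    private float x = 0f;

    public void update(float speed) {
        float delta = Gdx.graphics.getDeltaTime(); // seconds elapsed since the last frame
        if (Gdx.input.isKeyPressed(Keys.LEFT))  x -= speed * delta;
        if (Gdx.input.isKeyPressed(Keys.RIGHT)) x += speed * delta;
    }

    public float getX() {
        return x;
    }
}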
Event handling:
The key code alone is not enough when several keys are pressed simultaneously. For example, to type an uppercase letter you have to hold Shift and press the character key. That is why an event listener is used: it reports not only the code of the pressed / released key but also the character that goes with it. This is the event handling mechanism, and it gives us more information about the user's input.
* For event handling we must implement the InputProcessor interface and define all its methods.
public class InputProc implements InputProcessor {

    //****** these three methods are for key inputs ******//

    @Override
    public boolean keyDown(int codeCle) {
        return false;
    }

    @Override
    public boolean keyTyped(char caractere) {
        return false;
    }

    @Override
    public boolean keyUp(int codeCle) {
        return false;
    }

    //****** these methods are for mouse / touch inputs ******//
    // we will see them after this section

    @Override
    public boolean scrolled(int amount) {
        return false;
    }

    @Override
    public boolean touchDown(int x, int y, int pointer, int button) {
        return false;
    }

    @Override
    public boolean touchDragged(int x, int y, int pointer) {
        return false;
    }

    @Override
    public boolean touchMoved(int x, int y) {
        return false;
    }

    @Override
    public boolean touchUp(int x, int y, int pointer, int button) {
        return false;
    }
}
For now we are interested in the first three methods, which deal with the keyboard; we will see the others in the Mouse / Touchscreen section that comes afterwards. These first three methods let us listen to keyboard events.
• keyDown(): called when a key is pressed; takes the code of the pressed key as a parameter.
• keyUp(): called when a key is released; takes the code of the released key as a parameter.
• keyTyped(): called when a keyboard input generates a Unicode character; takes the generated character as a parameter.
* Once the interface is implemented and its methods defined, we must instantiate the class and register it as the input processor:
InputProc myInputProc = new InputProc();
Gdx.input.setInputProcessor(myInputProc);
Now every event can be detected.
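If you only care about a few events, Libgdx also provides the InputAdapter convenience class, which implements InputProcessor with empty methods so you only override what you need. Here is a small sketch (the class name LoggingProcessor and the log tag are ours) that reacts to the space bar:

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.Input.Keys;
import com.badlogic.gdx.InputAdapter;

// Overrides only keyDown(); all other InputProcessor methods keep their empty default.
public class LoggingProcessor extends InputAdapter {

    @Override
    public boolean keyDown(int keycode) {
        if (keycode == Keys.SPACE) {
            Gdx.app.log("Input", "Space bar pressed");
            return true;  // we handled the event
        }
        return false;     // let another processor handle it
    }
}

Register it the same way, for example in create(): Gdx.input.setInputProcessor(new LoggingProcessor());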
Inputs with Mouse / Touchscreen
We mentioned a little earlier some differences between the two platforms concerning the mouse (on Desktop) and the touch screen (on Android). To reduce the gap between Android and Desktop, Libgdx uses the same mechanisms we saw for the keyboard: polling and event handling.
The Polling
For polling, Libgdx provides a set of methods to check the current state of the inputs, for example whether the screen is touched by one or more fingers (on Android) or whether a mouse button is pressed (on Desktop). The polling mechanism consists of checking whether a mouse / touch screen input is currently happening. Polling is a quick and easy way to handle user input.
* Here's how to check whether one or more fingers are on the touchscreen (on Android):
boolean isTouche = Gdx.input.isTouched();
On Desktop, isTouche is true while a mouse button is pressed on the screen.
* To obtain the coordinates of a mouse click / finger on the screen:
int x = Gdx.input.getX();
int y = Gdx.input.getY();
* You can also get the distance the pointer has travelled since the previous frame:
int deltaX = Gdx.input.getDeltaX();
int deltaY = Gdx.input.getDeltaY();
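As an example of what these deltas are good for, here is a sketch (the class and field names are ours) that pans a pair of camera coordinates while the screen is touched or a mouse button is held:

import com.badlogic.gdx.Gdx;

// A minimal panning sketch: call update() once per frame from render().
public class PanByDrag {

    private float camX = 0f;
    private float camY = 0f;

    public void update() {
        if (Gdx.input.isTouched()) {
            camX -= Gdx.input.getDeltaX();  // move opposite to the drag
            camY += Gdx.input.getDeltaY();  // screen Y grows downwards
        }
    }
}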
* Now, to handle multi-touch on Android:
boolean firstFinger = Gdx.input.isTouched(0);
boolean secondFinger = Gdx.input.isTouched(1);
boolean thirdFinger = Gdx.input.isTouched(2);
Obviously, multiple touches only happen on Android. Libgdx handles them by assigning a pointer index to each finger that touches the screen. When a finger touches the screen, it gets the smallest available index (0, 1, 2, ...), and the index is freed as soon as that finger stops touching the screen. Here's how things go:
First finger touches the screen ----> it gets index 0
Second finger touches the screen ----> it gets index 1
Third finger touches the screen ----> it gets index 2
Third finger leaves (releases) the screen ----> index 2 is freed
Second finger leaves (releases) the screen ----> index 1 is freed
A new finger touches the screen ----> it gets index 1 (the lowest available index)
* To find out whether a new touch (finger or mouse click) has just started during the current frame:
boolean touche = Gdx.input.justTouched();
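The difference with isTouched() is that justTouched() is true only during the frame in which the touch started, which makes it convenient for reacting once per tap, as in this short snippet (the log tag is ours):

// Reacts once per tap / click instead of on every frame the finger stays down.
if (Gdx.input.justTouched()) {
    Gdx.app.log("Input", "New touch at " + Gdx.input.getX() + ", " + Gdx.input.getY());
}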
* To get the coordinates of each finger on the screen:
int firstFingerX = Gdx.input.getX();
int firstFingerY = Gdx.input.getY();
int secondFingerX = Gdx.input.getX(1);
int secondFingerY = Gdx.input.getY(1);
int thirdFingerX = Gdx.input.getX(2);
int thirdFingerY = Gdx.input.getY(2);
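Rather than writing one line per finger, you can loop over the pointer indexes; here is a short sketch doing that (the log tag is ours):

// Read the position of every finger currently touching the screen.
for (int pointer = 0; pointer < 3; pointer++) {
    if (Gdx.input.isTouched(pointer)) {
        int x = Gdx.input.getX(pointer);
        int y = Gdx.input.getY(pointer);
        Gdx.app.log("Input", "Finger " + pointer + " at " + x + ", " + y);
    }
}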
You have probably noticed that the default pointer index is 0; on Desktop it corresponds to the mouse pointer.
* It is also possible to obtain, for each finger, the distance travelled since the previous frame:
int firstFingerDeltaX = Gdx.input.getDeltaX();
int firstFingerDeltaY = Gdx.input.getDeltaY();
int secondFingerDeltaX = Gdx.input.getDeltaX(1);
int secondFingerDeltaY = Gdx.input.getDeltaY(1);
int thirdFingerDeltaX = Gdx.input.getDeltaX(2);
int thirdFingerDeltaY = Gdx.input.getDeltaY(2);
* To find out which mouse button is currently pressed:
boolean leftButton = Gdx.input.isButtonPressed(Input.Buttons.LEFT);
boolean rightButton = Gdx.input.isButtonPressed(Input.Buttons.RIGHT);
boolean middleButton = Gdx.input.isButtonPressed(Input.Buttons.MIDDLE);
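You can combine button polling with the position methods; note that a touch screen only ever reports the left button, so the branch below can only trigger on Desktop (a short sketch, the log tag is ours):

// Right click only exists with a mouse: on Android this condition stays false.
if (Gdx.input.isButtonPressed(Input.Buttons.RIGHT)) {
    Gdx.app.log("Input", "Right click at " + Gdx.input.getX() + ", " + Gdx.input.getY());
}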
Event Management:
The event handling mechanism gives us more information about the user's input. It provides a way to handle interactions with the user interface that would be difficult to achieve with polling alone.
* For event handling we must implement the InputProcessor interface and define all its methods.
public class ProcesseurDentree implements InputProcessor {

    //****** these three methods are for key inputs ******//
    // seen in the previous section

    @Override
    public boolean keyDown(int codeCle) {
        return false;
    }

    @Override
    public boolean keyTyped(char caractere) {
        return false;
    }

    @Override
    public boolean keyUp(int codeCle) {
        return false;
    }

    //****** these methods are for mouse / touch inputs ******//

    @Override
    public boolean scrolled(int amount) {
        return false;
    }

    @Override
    public boolean touchDown(int x, int y, int pointer, int button) {
        return false;
    }

    @Override
    public boolean touchDragged(int x, int y, int pointer) {
        return false;
    }

    @Override
    public boolean touchMoved(int x, int y) {
        return false;
    }

    @Override
    public boolean touchUp(int x, int y, int pointer, int button) {
        return false;
    }
}
Here are the methods that provide event handling for the mouse / touch screen:
• touchDown(): called when a finger touches the screen (Android) / a mouse button is pressed (Desktop). It gives us the coordinates, the pointer index and the mouse button. Note that on Android the button is always reported as the left button.
• touchUp(): called when a finger is lifted from the screen (Android) / a mouse button is released (Desktop). It gives us the coordinates, the pointer index (Android) and the mouse button (Desktop).
• touchDragged(): called when a finger slides over the touch screen (Android) / the mouse is moved while a button is held down (Desktop). It gives us the coordinates and the pointer index, but to know which mouse button is involved we have to use polling.
• touchMoved(): called when the mouse is moved over the screen without any button being pressed. This method is only called on Desktop; it is never called on Android, since a touch screen has no cursor like a mouse does.
• scrolled(): called when the mouse wheel is rotated. It tells us the direction of rotation of the wheel. It is never called on Android.
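To show these methods working together, here is a small sketch (the class and field names are ours, built on the InputAdapter convenience class, which implements InputProcessor with empty methods) that lets the user drag a point around with the finger or the mouse:

import com.badlogic.gdx.Gdx;
import com.badlogic.gdx.InputAdapter;

// Drag a point with the finger (Android) or the mouse (Desktop).
public class DragPoint extends InputAdapter {

    private float pointX;
    private float pointY;
    private boolean dragging = false;

    @Override
    public boolean touchDown(int x, int y, int pointer, int button) {
        dragging = true;                            // finger down / button pressed
        return true;
    }

    @Override
    public boolean touchDragged(int x, int y, int pointer) {
        if (dragging) {
            pointX = x;
            pointY = Gdx.graphics.getHeight() - y;  // convert to y-up coordinates
        }
        return true;
    }

    @Override
    public boolean touchUp(int x, int y, int pointer, int button) {
        dragging = false;                           // finger lifted / button released
        return true;
    }
}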
* Once the interface is implemented and its methods defined, we must instantiate the class and register it as the input processor:
ProcesseurDentree myInputProc = new ProcesseurDentree();
Gdx.input.setInputProcessor(myInputProc);
Once the processor is registered, every event can be detected.
Problems?
If you have any comments or questions, please post them in the comments section. Thank you for reading.