This post is an update and an extension to yesterday's post, 2012/07/10 - [OpenGL] When textures do not show properly without any glError in Android.


Previously, I found out that packaging an image resource in res/drawable can cause problems when creating textures. Despite this discovery, some images still did not render as textures at all on some devices (namely the Galaxy Player). So the research went on.


The research continued with the simple example I created and mentioned in my previous note. The image that wasn't working as a texture was a 1024x512 image of the earth's surface. Once again, I checked the bitmap's internal format, type, and config, but I could not find any difference from the case where a different image worked.
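For reference, here is a minimal sketch of how those three properties can be inspected; bitmap and TAG are placeholders for whatever bitmap and log tag your code uses:

int internalFormat = GLUtils.getInternalFormat(bitmap); // e.g. GL_RGB vs. GL_RGBA
int type = GLUtils.getType(bitmap);                     // e.g. GL_UNSIGNED_SHORT_5_6_5
Log.d(TAG, "format=" + internalFormat + " type=" + type + " config=" + bitmap.getConfig());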


The blame then fell on the dimensions of the image. The image that worked as a texture on the Galaxy Player was 512x512. Could it be working because it is square? So I scaled the non-working image down to 512x512 and it worked (I was wrong. It still didn't work. I got my situation mixed up. Updated 2012/7/16). But that didn't make much sense, since I wasn't able to find any requirement that textures be square. Moreover, OpenGL ES 2.0 specifies that the dimensions of a texture do not have to be powers of two.


Dropping the power-of-two requirement while adding a requirement that a texture image must be square would be ridiculous. And if that were the case, why would it work on other devices?


More research went on, and I finally found out that the malfunction has something to do with setting GL_TEXTURE_MIN_FILTER. It turns out that on the Galaxy Player, the texture does not render properly if GL_TEXTURE_MIN_FILTER is set to a mipmap filter (linear or nearest) and the image is not square.
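In other words, on this device a non-square texture only renders when minification does not use mipmaps. A minimal workaround sketch, assuming textureId is the texture you just created, is to fall back to a non-mipmap filter:

GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
// avoid GL_*_MIPMAP_* minification filters for non-square images on this device
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
        GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
        GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);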


It seems pretty obvious to me that this is an OpenGL implementation bug on the device, but I had to check. So, I posted a question about this on Stack Overflow. If you have any useful information, please leave a comment here or on Stack Overflow.


Thanks.


Posted by Dansoonie

While I was exploring the features of Rajawali by creating some sample code, I encountered a strange situation where textures showed up on one device but not on another. Nothing complicated is going on in the sample code: just rendering a sphere object with a texture of the earth's surface. The image used for the texture was saved in res/drawable and the bitmap was created at runtime using BitmapFactory.decodeResource(). The strangest thing was that glError was not flagged at any point (at least I think I checked thoroughly).


FYI, the working device was the Galaxy Nexus, and the non-working device was the Galaxy Player GB70.


To attack this issue, I created a simple project that renders a flat square with the image I was having trouble using as the sphere's texture in Rajawali.


The first thing I noticed was that the only difference between the working device and the non-working device was that the image was decoded into an ARGB_8888 bitmap config on the working device and an RGB_565 bitmap config on the non-working device. However, when I forced the image to be decoded into an RGB_565 bitmap config on the working device, it still worked.
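Forcing the decode config can be done through BitmapFactory.Options; here is a minimal sketch of what I mean (R.drawable.earth is a placeholder for your actual resource):

BitmapFactory.Options options = new BitmapFactory.Options();
options.inPreferredConfig = Bitmap.Config.RGB_565; // mimic the non-working device's config
Bitmap bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.earth, options);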


I tried changing the parameters for glTexImage2D, converting the image to another bitmap config (via Bitmap.copy()), and all sorts of other things without much luck. So I went back to Googling to do more research.


You can probably find almost any information you want on the Internet, and I found the reason the texture was not showing properly. Here is the meaningful piece of information: someone who was having a problem similar to mine posted a question on Stack Overflow. Luckily, he found the solution on his own and was nice enough to share what he learned. Special thanks to him/her.



In Android, image resources can be packaged under res/drawable. Since there are many Android devices with different screen densities, image resources are meant to be packaged in various sizes in drawable-ldpi, drawable-mdpi, and drawable-hdpi under res/. For convenience, you can also package resources simply under res/drawable, and the system will automatically handle the resizing. Here's a quote from the Android developer's page regarding support for multiple screens.


The "default" resources are those that are not tagged with a configuration qualifier. For example, the resources in drawable/ are the default drawable resources. The system assumes that default resources are designed for the baseline screen size and density, which is a normal screen size and a medium density. As such, the system scales default density resources up for high-density screens and down for low-density screens, as appropriate.


http://developer.android.com/guide/practices/screens_support.html


This is something I wasn't completely unaware of, but it still bit me. The problem is probably that when the resource was decoded into a bitmap using BitmapFactory, the image was resized to dimensions that are likely not powers of two. The OpenGL ES 2.0 specification indicates that non-power-of-two textures are supported (see the OpenGL ES 2.0 common profile specification, section 3.8 Texturing, p. 17). Strictly speaking, core ES 2.0 supports non-power-of-two textures only with CLAMP_TO_EDGE wrapping and without mipmaps; full support requires the GL_OES_texture_npot extension. Either way, I'm suspicious about whether every OpenGL ES 2.0 implementation strictly follows the specification here.


What I didn't really know was that drawable resources under res/drawable-nodpi are density-independent resources that the system does not resize when decoding into a bitmap. Honestly, I thought drawable resources under res/drawable would be decoded in a density-independent manner too.
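So if you want a texture image decoded at its original pixel size, either move it to res/drawable-nodpi/ or disable density scaling at decode time. A minimal sketch of the latter (again, R.drawable.earth is a placeholder):

BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false; // decode at the file's native dimensions, ignoring density
Bitmap texture = BitmapFactory.decodeResource(getResources(), R.drawable.earth, options);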

 

I'll have to see whether the problem was caused by resizing the resource to a non-power-of-two dimension. If that is true, I'm also surprised that glError was not flagged at all. Anyway, for now, if you are having trouble loading textures on Android, check whether the drawable resource you are using as a texture is packaged under res/drawable.


Problem partly solved, but the texture still does not show up on the sphere in some cases when using Rajawali, so the research goes on...

If you have any knowledge about this problem, or if I have written something incorrect here, please leave a comment and let me know.


Posted by Dansoonie
A few days ago, Regina 3D Launcher, the Android launcher I have been helping develop at work, went up on the Android Market. I take pride in our product, but since it is a launcher being developed by a small team under difficult circumstances, it looks slightly lacking compared to other commercial launchers, so I doubted how much success it could achieve. It seems to be drawing considerably more attention than I expected.

At first we promoted it mainly to acquaintances on Twitter and Facebook, but it was quite remarkable that reviews of our launcher appeared in unexpected places just two days after release. Seeing it reviewed on websites in Germany and Hong Kong was really amazing...


Regina 3D Launcher is also being mentioned on several forums, including the xda developers forum (reference - a post mentioning Regina 3D Launcher).

The cumulative download count recorded this morning passed 1,500, and since I have confirmed that the app can also be downloaded from many sites that live off the Android Market by copying its listings almost verbatim and brokering downloads, I expect the real number of downloads is considerably higher...

Today (the 13th), I seem to have spent almost the entire day answering questions, feedback, and bug reports from all over the world one by one. What surprised me even more is that some people wrote not to ask questions, give feedback, or report bugs, but purely to send encouragement and praise.

I had always thought of the world of commercial software, and the smartphone app market in particular, as a cutthroat and heartless place, so it was refreshing to discover that the Android Market also has a warm atmosphere where people encourage and recognize each other out of pure developer passion.

In any case, now that something I took part in is starting to get a little attention around the world, I feel a bit happy, but my worries about whether we can handle and apply users' requests in time are growing too...

Anyway, for smoother communication with users, we have opened a Regina 3D Launcher Facebook Page. If you are interested, I hope it becomes a place to get the latest news, have your questions answered, and share other useful information. Since Regina 3D Launcher was released with the whole world in mind, the language used on the Facebook page is English. I ask for your understanding on that point.

A small wish of mine going forward is for Regina 3D Launcher (레지나 런처) to climb the search rankings on Naver, and another is for Regina 3D Launcher to get an entry on Wikipedia... Muahahaha~

 
Posted by Dansoonie
The commercial (yet free) software I helped develop has been released to the public through the Android Market, for the first time in my life. It is a launcher for Android, and unlike the many launchers already out there, it uses 3D graphics; I dare say it is a launcher that can delight users with a variety of visual effects. Introducing the (still far from perfect) ambitious work of NemusTech's Tiffany team... Regina 3D Launcher.

Regina 3D Launcher is a launcher built with Tiffany, the 3D GUI framework developed by the company I work for. Having participated from the very start of development, mainly designing the basic structure of the launcher and improving Tiffany to support a 3D launcher, I am very attached to this piece of work.

So... the features of Regina 3D Launcher are...
* As already mentioned, it is a launcher that uses 3D graphics.
* It nevertheless supports Android widgets.
* It uses 3D graphics to offer many entertaining visual effects.
* Moving between workspaces is more fun and convenient.
* You can give each workspace a name.
* Applications can be uninstalled directly from a shortcut or from the application list.
* Unlike other launchers, shortcuts and widgets can be positioned very freely.
* There is a "secret workspace" feature for protecting your privacy.
* Unlike the stock launcher, a wallpaper can be set independently for each workspace.

Here is an actual demo video of Regina 3D Launcher in use.


It is true that we achieved more than expected because everyone on the Tiffany team that developed Regina 3D Launcher, and the design team that helped with the design, participated proactively; personally, though, I regret that I don't seem to have worked hard enough to satisfy my own ambitions. To everyone at NemusTech who helped with development and took part in testing: thank you for all your hard work... Now it's time to face the users' judgment...

If you are interested, go to https://market.android.com/details?id=com.nemustech.regina to check it out and give it a try. It's free~

If you want to download it right away, scan the following QR code and it will take you straight to the Market.

  
Posted by Dansoonie
My previous post was about unexpected behavior of Buffers in Honeycomb that seemed like a bug. The bug I found was about float values in a cloned read-only FloatBuffer being interpreted differently from the contents of the original FloatBuffer (2011/04/07 - Unexpected behavior of Buffers in Honeycomb (Android 3.0)). I submitted a bug report to Google; it was confirmed as a bug and has now been fixed internally for a future Ice Cream Sandwich release.

Please refer to http://code.google.com/p/android/issues/detail?id=15994 for more information. My first successful bug report on a commercial product. Yay~

I thank all my colleagues at work who gave me this opportunity and helped me tackle this issue. 
Posted by Dansoonie
I am an Android software developer. At work I am developing a 3D GUI framework called Tiffany. OpenGL is used at the core of Tiffany. Like any other OpenGL application, our product uses float values to define the locations (positions) of the vertices that make up 3D objects. Therefore, FloatBuffers are used frequently.

Tiffany had been working great with all Android versions. However, I recently got a report that Tiffany behaves a bit weirdly on Honeycomb, that is, Android 3.0, the Android version for tablet devices. I was able to track down the cause of the problem and find out what was going on with the help of Mr. Shin, whom we think of as a genius. The unexpected behavior originated from ByteBuffer/FloatBuffer.

In a portion of our code there was something going on like the following.
ByteBuffer byteBuffer =
        ByteBuffer.allocateDirect(n * 4).order(ByteOrder.nativeOrder());
FloatBuffer buffer = byteBuffer.asFloatBuffer();

// ...
// We put some float values in the buffer
// ...

FloatBuffer copiedBuffer = buffer.asReadOnlyBuffer();

// ...
// use the values in copiedBuffer
// ...


As I was debugging the code line by line, I found out that the values retrieved from copiedBuffer were interpreted incorrectly on Honeycomb. This was very unexpected, as the same code worked perfectly on previous Android versions.

Here is what was happening. Buffers in Android have a property called order. This property indicates whether the buffer uses big-endian or little-endian byte order; in other words, it defines how the bytes in the buffer are interpreted. It turns out that on Honeycomb this property is altered in the copied buffer created by asReadOnlyBuffer(). What is more interesting is that this problem only shows up when the ByteOrder of the ByteBuffer has been explicitly specified using ByteBuffer's order(ByteOrder byteOrder) method.

Here is a simple example which illustrates this problem.

// direct buffer, byte order explicitly set to native (little-endian) order
ByteBuffer byteBuffer0 =
        ByteBuffer.allocateDirect(4).order(ByteOrder.nativeOrder());
FloatBuffer buffer0 = byteBuffer0.asFloatBuffer();
buffer0.put(0.1f);
FloatBuffer copiedBuffer0 = buffer0.asReadOnlyBuffer();
Log.d(TAG, "buffer0 endian: " + buffer0.order());
Log.d(TAG, "buffer0[0]: " + buffer0.get(0));
Log.d(TAG, "copiedBuffer0 endian: " + copiedBuffer0.order());
Log.d(TAG, "copiedBuffer0[0]: " + copiedBuffer0.get(0));

// non-direct buffer, byte order explicitly set to native order
ByteBuffer byteBuffer1 =
        ByteBuffer.allocate(4).order(ByteOrder.nativeOrder());
FloatBuffer buffer1 = byteBuffer1.asFloatBuffer();
buffer1.put(0.1f);
FloatBuffer copiedBuffer1 = buffer1.asReadOnlyBuffer();
Log.d(TAG, "buffer1 endian: " + buffer1.order());
Log.d(TAG, "buffer1[0]: " + buffer1.get(0));
Log.d(TAG, "copiedBuffer1 endian: " + copiedBuffer1.order());
Log.d(TAG, "copiedBuffer1[0]: " + copiedBuffer1.get(0));

// direct buffer, byte order left at the default (big-endian)
FloatBuffer buffer2 = ByteBuffer.allocateDirect(4).asFloatBuffer();
buffer2.put(0.1f);
FloatBuffer copiedBuffer2 = buffer2.asReadOnlyBuffer();
Log.d(TAG, "buffer2 endian: " + buffer2.order());
Log.d(TAG, "buffer2[0]: " + buffer2.get(0));
Log.d(TAG, "copiedBuffer2 endian: " + copiedBuffer2.order());
Log.d(TAG, "copiedBuffer2[0]: " + copiedBuffer2.get(0));

// non-direct buffer, byte order left at the default (big-endian)
FloatBuffer buffer3 = ByteBuffer.allocate(4).asFloatBuffer();
buffer3.put(0.1f);
FloatBuffer copiedBuffer3 = buffer3.asReadOnlyBuffer();
Log.d(TAG, "buffer3 endian: " + buffer3.order());
Log.d(TAG, "buffer3[0]: " + buffer3.get(0));
Log.d(TAG, "copiedBuffer3 endian: " + copiedBuffer3.order());
Log.d(TAG, "copiedBuffer3[0]: " + copiedBuffer3.get(0));


The result in a Honeycomb (Android 3.0) AVD would look like the following.

buffer0 endian: LITTLE_ENDIAN
buffer0[0]: 0.1
copiedBuffer0 endian: BIG_ENDIAN
copiedBuffer0[0]: -4.2949213E8

buffer1 endian: LITTLE_ENDIAN
buffer1[0]: 0.1
copiedBuffer1 endian: BIG_ENDIAN
copiedBuffer1[0]: -4.2949213E8

buffer2 endian: BIG_ENDIAN
buffer2[0]: 0.1
copiedBuffer2 endian: BIG_ENDIAN
copiedBuffer2[0]: 0.1

buffer3 endian: BIG_ENDIAN
buffer3[0]: 0.1
copiedBuffer3 endian: BIG_ENDIAN
copiedBuffer3[0]: 0.1


The result in an Android 2.X AVD would look like the following.

buffer0 endian: LITTLE_ENDIAN
buffer0[0]: 0.1
copiedBuffer0 endian: LITTLE_ENDIAN
copiedBuffer0[0]: 0.1

buffer1 endian: LITTLE_ENDIAN
buffer1[0]: 0.1
copiedBuffer1 endian: LITTLE_ENDIAN
copiedBuffer1[0]: 0.1

buffer2 endian: BIG_ENDIAN
buffer2[0]: 0.1
copiedBuffer2 endian: BIG_ENDIAN
copiedBuffer2[0]: 0.1

buffer3 endian: BIG_ENDIAN
buffer3[0]: 0.1
copiedBuffer3 endian: BIG_ENDIAN
copiedBuffer3[0]: 0.1

 
So, here is my conclusion. Dalvik, like Java, defaults ByteBuffers to big-endian, while the hardware underneath (and Linux, the operating system I am using at work) is little-endian; as a result, ByteBuffers are created big-endian by default. However, when the ByteOrder has been explicitly set to little-endian, the order property is not properly carried over to the new buffer on Honeycomb. I suspect this is a bug in Honeycomb: Honeycomb is the only Android version that behaves this way, and it makes no logical sense to interpret a copied buffer with a different byte order than the original. Moreover, the order property not being copied looks very much like a mistake, since you cannot even set the order property on a FloatBuffer.

I must admit that specifying the ByteOrder of the buffer may have been an unnecessary step, but Honeycomb's handling of Buffers still doesn't make much sense.
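For what it's worth, here is a possible workaround sketch until a fixed release ships: derive the read-only view from the underlying ByteBuffer rather than from the FloatBuffer, so the byte order can be re-asserted explicitly. I have not verified this on Honeycomb, so treat it as an assumption:

ByteBuffer bytes = ByteBuffer.allocateDirect(n * 4).order(ByteOrder.nativeOrder());
FloatBuffer buffer = bytes.asFloatBuffer();
// ... put float values into buffer ...
FloatBuffer readOnlyCopy = bytes.asReadOnlyBuffer() // read-only ByteBuffer view
        .order(ByteOrder.nativeOrder())             // re-assert the intended order
        .asFloatBuffer();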
Posted by Dansoonie
Are you developing an Android app that requires a long click, tap, touch, or whatever? Then you should be adding an OnLongClickListener to your view. And if you are also going to carry out complex tasks with touch events, you might be adding an OnTouchListener to your view too! That is exactly what I was doing at work. Unfortunately, I was having a weird problem. I expected the OnLongClickListener to capture the long click event only when I pressed my finger on the view for a certain amount of time. However, OnLongClickListener's onLongClick() was being called on every touch event, along with OnTouchListener's onTouch().

I was googling to find out why, eagerly seeking a solution. Surprisingly, it seemed like there weren't many people with the same problem (or maybe I wasn't using the right keywords). Eventually I found out what was causing the problem, and I have decided to share the experience. Not that it is anything very unusual, but it may help those looking for a quick answer.

The problem was in my implementation of onTouch() in the OnTouchListener. Traditionally (going all the way back to Windows programming), when dealing with events, you return true when the event handler handles the event, or more precisely when it consumes it, and return false when the event is meaningless to the handler, so that the event is passed on to the next available handler in the user-interface hierarchy. This convention I was following was causing the problem.

In order for Android to detect long clicks, it must keep track of the time after a touch down event has occurred. After reading the Android developer's documentation and QnA threads on third-party forums, I reached the conclusion that this time checking takes place in a somewhat unexpected place. Honestly, I don't have much experience in Windows programming, so I'm not sure whether it is the same there, but to capture long click events correctly, onTouch() must return false for ACTION_DOWN events.

To explain in detail, let's take a look at some code.

import android.app.Activity;
import android.content.Context;
import android.os.Bundle;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;
import android.view.View.OnLongClickListener;
import android.view.View.OnTouchListener;
import android.view.ViewGroup.LayoutParams;

public class LongClickTest extends Activity {

    public class TestView extends View {

        final static private String TAG = "LongClickTest";

        public TestView(Context context) {
            super(context);
            this.setOnTouchListener(mTouchListener);
            this.setOnLongClickListener(mLongClickListener);
        }

        private OnLongClickListener mLongClickListener = new OnLongClickListener() {
            @Override
            public boolean onLongClick(View view) {
                Log.d(TAG, "LongClick !!!");
                return true;
            }
        };

        private OnTouchListener mTouchListener = new OnTouchListener() {
            @Override
            public boolean onTouch(View view, MotionEvent event) {
                boolean consumed = false;
                switch (event.getAction()) {
                case MotionEvent.ACTION_DOWN:
                    Log.d(TAG, "ACTION_DOWN");
                    consumed = true;
                    break;
                case MotionEvent.ACTION_MOVE:
                    Log.d(TAG, "ACTION_MOVE");
                    consumed = true;
                    break;
                case MotionEvent.ACTION_UP:
                    Log.d(TAG, "ACTION_UP");
                    consumed = true;
                    break;
                }
                return consumed;
            }
        };
    }

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        LayoutParams layoutParams =
                new LayoutParams(LayoutParams.FILL_PARENT, LayoutParams.FILL_PARENT);
        setContentView(new TestView(this), layoutParams);
    }
}


Here I have made an inner class inside the Activity class for convenience. The inner class, called TestView, extends View, and an instance of TestView is set as the content view of the Activity. Inside the inner class, I implemented an OnTouchListener and an OnLongClickListener that log the occurring events, and set them as the view's listeners. If you execute the program, all you'll see is a blank, empty black screen, but it's sufficient for testing.

First, let's try executing the above code as it is. In logcat, all you will see are logs of the ACTION_DOWN, ACTION_MOVE, and ACTION_UP events; onLongClick() in the OnLongClickListener is never triggered. As I mentioned earlier, this is because the OnTouchListener has consumed all the events, so Android has no chance to keep track of how much time elapsed between ACTION_DOWN and ACTION_UP.

So, let's make some changes to the OnTouchListener's onTouch(), like the following.

@Override
public boolean onTouch(View view, MotionEvent event) {
    boolean consumed = false;
    switch (event.getAction()) {
    case MotionEvent.ACTION_DOWN:
        Log.d(TAG, "ACTION_DOWN");
        consumed = false; // let the view's default handler see ACTION_DOWN
        break;
    case MotionEvent.ACTION_MOVE:
        Log.d(TAG, "ACTION_MOVE");
        consumed = true;
        break;
    case MotionEvent.ACTION_UP:
        Log.d(TAG, "ACTION_UP");
        consumed = true;
        break;
    }
    return consumed;
}



By returning false for ACTION_DOWN events, Android is now able to keep track of how much time passes after ACTION_DOWN occurs. As a result, a decent amount of time (about a second) after your fingertip touches the screen, you will see the log "LongClick !!!" in logcat. However, you will also find that onLongClick() is called every time you touch the screen, leaving the "LongClick !!!" log even though you have lifted your fingertip off the screen. This is the problem I was having.

I have to admit that the code I was working on was poorly written, because my original intention was to return true whenever the OnTouchListener spotted an ACTION_DOWN event and handled it. Anyway, this was the flaw in my code, and it causes the problem I am illustrating here: Android can now measure the amount of time that passes after the ACTION_DOWN event, but it never knows when to stop measuring. That is because my OnTouchListener consumed the ACTION_UP event, so Android has no clue when ACTION_UP occurred. Do you see what is going on???

Therefore my conclusion is to return false no matter what. In this case, returning false for ACTION_DOWN and ACTION_UP would be enough to solve the problem. However, if you have to deal with a more complex touch event sequence, onTouch() should return false for all events so that Android can capture whatever events it needs to trigger other behavior. Some other gestures might require ACTION_MOVE to be seen outside the OnTouchListener you are implementing.

Therefore, the resulting code is the following.

@Override
public boolean onTouch(View view, MotionEvent event) {
    switch (event.getAction()) {
    case MotionEvent.ACTION_DOWN:
        Log.d(TAG, "ACTION_DOWN");
        break;
    case MotionEvent.ACTION_MOVE:
        Log.d(TAG, "ACTION_MOVE");
        break;
    case MotionEvent.ACTION_UP:
        Log.d(TAG, "ACTION_UP");
        break;
    }
    return false; // never consume; let the view keep tracking the gesture
}


So this is a simple example demonstrating why your onLongClick() is called along with onTouch(). I haven't put much thought into whether this is a good design or not, but at the moment I don't like this way of handling events. If it has to operate this way, why does OnTouchListener need to return a boolean in the first place...

Anyway, this is how Android is, and I can't blame the folks at Google, because they probably know what they are doing...

Read http://developer.android.com/guide/topics/ui/ui-events.html for more information, especially the parts that start with "onTouch:" and "note:" (<- mind the colons; I've included them to indicate exactly where the information is).
Posted by Dansoonie
As far as I know, the Android emulator does not simulate hardware sensors. I spent plenty of time searching Eclipse and the Android developers site for a way to throw hardware sensor events at the emulator. I came across a part of the Android developer's guide that describes a means of sending events to the emulator, but the explanation was so brief I had no idea what to do, and I'm not even sure that was the part describing what I wanted to achieve. In other words, none of the references I have looked at clearly explain how to simulate hardware sensors in the emulator. For this reason, if you are developing an Android application that operates based on the orientation (or acceleration) of the device, you might be having trouble debugging your app.

So be it... I was sure there must be some way to do it. After googling for a while, I was led to openintents' Google Code project page, where I found a sub-project called "Sensor Simulator". It was exactly the thing I was looking for, and it served its purpose well: I was able to simulate hardware sensors and send events to the Android emulator with it. However, the wiki page on the Google Code project site seems to be outdated, and you must modify the code produced by following the instructions listed there in order to get it to work. So I thought I should share how I managed to make it work, along with my understanding of how the thing works, since the document on the wiki page gives partly incorrect instructions.


The distributed zip file of Sensor Simulator contains the hardware sensor simulator server, the client, and an external jar library. The server is a GUI application in which you can change the orientation of the device; it generates the sensor input values. The client is an Android package that must be installed on the emulator (the Android device); it reads the generated sensor input from the server. The external jar library consists of classes that provide an abstraction layer over SensorManager for reading the sensor values that the client retrieved from the server.

To reduce confusion for future readers, I should note the version I downloaded and worked with, for anyone who refers to this post after an updated version becomes available and this post becomes invalid. That said, it has already been a while since the distributed zip file was last updated, in July 2009. Everything works quite well, and I see no reason to expect updates in the near future unless there is a major change to the Android class architecture related to hardware sensors. Anyway, the version I worked with is 1.0.0-beta1, on Android SDK 2.0.


The Sensor Simulator wiki page can be found here.
And the distribution file I downloaded can be found here.

The following instructions are based on the project's wiki page; the modified and added instructions and their descriptions are based on my personal experience.

1. Installing the client in the Android emulator
Install SensorSimulatorSettings.apk from the bin directory onto the Android emulator. I'm pretty sure there is a way to install Android packages from the DDMS perspective in Eclipse, but I personally prefer using the adb tool from the command prompt:
adb install SensorSimulatorSettings.apk
You might have to add options to adb if you have multiple devices connected or emulators running.
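For example, to target the emulator explicitly (emulator-5554 is the default serial of the first emulator instance; adjust to whatever adb devices reports):
adb -s emulator-5554 install SensorSimulatorSettings.apk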

2. Execute the server and configure the client to make a connection
Execute sensorsimulator.jar in the bin directory, which is the executable jar file of the server. The following is a screenshot of the server application. (The server app may look weird because the layout gets messed up and the textboxes appear smaller than they are supposed to be. You can correct this by resizing the window.)

<An instance of the Sensor Simulator Server Application>

 
Scroll up the corresponding textbox in the server application, marked with a red rectangle in the picture above. As you scroll up, you will see the possible IPs that the client can use to identify and connect to the server. Do not quit the server application.



Now start the Sensor Simulator client that you installed on your Android emulator. The client Android app will look like the screenshot on the right.

Enter one of the possible IPs shown in the server application into the IP address field. If you did not change the port number in the server application, you can use the default value, 8010.






Then go to the Testing tab, and you will see something like the screenshot on the left. Press the connect button and select the sensors you would like to retrieve information from. The order of pressing the connect button and selecting the sensors does not seem to matter. Since the purpose of this post is to demonstrate how to simulate hardware sensors, I will enable the orientation sensor only.

Now play around with the server. You can change the orientation of the device by clicking and dragging the figure that looks like a mobile device, or by using the slide bars. See whether the values change in the client where it is marked with a red rectangle in the picture on the left.



If you can see the values change in the Android emulator as you play around with the server application, the emulator is successfully retrieving the simulated hardware sensor values.


3. Modifying the code to use simulated hardware sensor input
If you are developing an Android application that reads hardware sensor values, a portion of your application will contain the usual code that obtains a SensorManager via getSystemService() and registers a SensorListener; that is the part that changes below. My example code is not tested on an actual device, so I cannot guarantee it works, but I have checked that it builds. Also, keep in mind that it contains a lot of deprecated API calls that I did not bother to update, since I was referring to an outdated book published in the days of Android SDK 1.5 when putting together this quick example.

Here are the changes you must make to use the values from the Sensor Simulator in place of the hardware sensors.

  1. Change the type of mSensorManager to SensorManagerSimulator, and create the SensorManagerSimulator instance using the static method SensorManagerSimulator.getSystemService(Context, String).
  2. Connect the SensorManager to the simulator by calling the static method SensorManagerSimulator.connectSimulator() before registering the SensorListener.
The resulting code may look like the following; only a couple of lines are added or modified.
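Here is a rough sketch of those two changes. This is my reconstruction, not an exact listing: mListener stands for your SensorListener, and the SensorManagerSimulator signatures are taken from the two steps above, so they may differ in other versions of the library:

// sketch only: the two Sensor Simulator changes applied to typical sensor code
private SensorManagerSimulator mSensorManager; // type changed from SensorManager

@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    // modified: obtain the simulator-backed manager instead of the real one
    mSensorManager = SensorManagerSimulator.getSystemService(this, SENSOR_SERVICE);
    // added: connect to the simulator server before registering the listener
    mSensorManager.connectSimulator();
    mSensorManager.registerListener(mListener,
            SensorManager.SENSOR_ORIENTATION, SensorManager.SENSOR_DELAY_UI);
}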

There isn't much to modify after all. Here is how the instructions here differ from the project wiki page.
 The instruction that says to retrieve the content resolver and create an Intent instance to start an Activity with it is left out, because it seems to be unnecessary code. The wiki explains it as if it were a very important step, but it causes syntax errors, and I have no idea how to interpret it in order to make proper modifications. Everything still works without those lines.
 The step of unregistering the SensorListener before registering a new SensorListener is also left out, because I see no reason to do that in my simple example. That step might be required in some special cases, but I don't think they come up often.
 The way the SensorManagerSimulator instance is created is different. The SensorManagerSimulator constructor is now private, and the instance can only be created via the static method getSystemService().


4. The result

<You can see the simulated hardware sensory input value displayed in the TextView>


If your application stops unexpectedly, review the logs from logcat and see if you find anything similar to the following phrase:

"Permission denied (maybe missing INTERNET permission)."

If this is the case, add the following line right before </manifest> at the end of AndroidManifest.xml.

<uses-permission android:name="android.permission.INTERNET"/> 

Posted by Dansoonie
For a while, I lost sight of myself...

My life kept flowing in a direction very different from what I intended, and I was becoming more and more listless and unmotivated about everything. Part of it may be that I have less free time, but the bigger reasons seemed to be that, having lost any sense of purpose, there was nothing I clearly wanted to do, and that working life was so different from what I had imagined that, deeply disappointed, I was living far too pessimistically.

I no longer have great expectations for the life unfolding ahead of me, nothing exciting happens to me...
and with nothing to look forward to, I just let each day slip by...

Go to work, work, go home... Even when there is something I want to do, in our country, where overtime is frequent and quitting time is irregular, it is hard to carve out time for hobbies... Other people would say it's because I'm lazy, but my colleagues and I agree that at my company it is very hard to live that way without worrying about appearances... ㅡ.ㅡ;

Anyway... realizing I couldn't keep living like this, I made up my mind to live at least a little more productively. And while talking with a friend, I heard the news that Google is holding a contest for Android applications...

So I've decided... for the time being, I will devote myself to Android...
Right now my friend and I are brainstorming ideas for an application...
What kind of application would be good to have on a phone???

Hoping it becomes a true driving force in my life... I vow to work hard...
Let me show off the programming and design skills I've built up all this time~
Posted by Dansoonie