How to time socket throughput - Resolved

 
Ranch Hand
Posts: 53
I created a client/server messaging system in Java that takes subscribers and passes messages to the subscriber client sockets. The subscribers are Flex Mobile clients; the server is Java. When I ran the app on my desktop simulating an Android or iOS device, everything was fine until the messages got large. At that point I discovered that the bytes do not arrive all at once. For instance, the server sent a 31,500-byte message, but after my simulation client failed I found the client socket reporting 3,762 bytes, then 16,060, and so on until it eventually reached 31,500.
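For reference, this chunked arrival is normal TCP stream behavior: a single read returns only whatever bytes have arrived so far, not the whole logical message. A minimal Java sketch (the host, port, and payload are assumptions) that logs each chunk as it comes in:

import java.io.InputStream;
import java.net.Socket;

public class ChunkedReadDemo {
    public static void main(String[] args) throws Exception {
        // Assumes a server on localhost:9000 that writes one large payload and closes.
        try (Socket socket = new Socket("localhost", 9000)) {
            InputStream in = socket.getInputStream();
            byte[] buffer = new byte[64 * 1024];
            int total = 0;
            int n;
            // Each read() returns only the bytes currently available,
            // e.g. 3,762, then 16,060, ... until the sender closes the stream.
            while ((n = in.read(buffer)) != -1) {
                total += n;
                System.out.println("read " + n + " bytes, " + total + " so far");
            }
            System.out.println("stream closed after " + total + " bytes");
        }
    }
}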

To fix this, I changed the Flex code. The fix lives in a Flex ProgressEvent listener: I accumulate the incoming bytes into a ByteArray, and once the ByteArray is fully constructed I cast it to whatever it needs to be cast to (such as an Array), then I reset the stream and set bytesIn = 0.
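In rough outline, the same accumulation logic looks like this in Java (a sketch only; the real code is ActionScript in a ProgressEvent handler, and the stream wiring, buffer size, and names like bytesTotal here are assumptions):

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class MessageAccumulator {
    private final ByteArrayOutputStream message = new ByteArrayOutputStream();
    private int bytesIn = 0;

    /** Appends whatever is available; returns the complete message once bytesTotal has arrived, else null. */
    public byte[] onDataAvailable(InputStream in, int bytesTotal) throws IOException {
        byte[] chunk = new byte[8192];
        int n = in.read(chunk);
        if (n > 0) {
            message.write(chunk, 0, n);
            bytesIn += n;
        }
        if (bytesIn < bytesTotal) {
            return null;          // not complete yet; wait for the next chunk
        }
        byte[] complete = message.toByteArray();
        message.reset();          // "reset the stream and set bytesIn = 0"
        bytesIn = 0;
        return complete;          // caller deserializes it (e.g. into an Array)
    }
}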

This works fine in my desktop simulation; however, on my Android and iOS test devices I see the following happen:

1. The server sends the header message and the client sees that the total byte count is 192. The client reports ByteArray.length == 0.
2. The server sends the actual message, but the client does not read it at all, so ByteArray.length remains 0. I believe the follow-up message is being sent too quickly for the mobile device, while my desktop is able to keep up.
3. The server sends a new header stating that the next message it is about to send is 61 bytes (the server does not yet know that the client never fully processed the first message).
4. The client now reports ByteArray.length == 2. Because it missed the actual message in step 2, it thinks it is reading the message body when it is really reading header bytes.
5. The client now reports ByteArray.length == 63, which is the 2 misread header bytes plus the 61 message bytes; the client still thinks it is receiving the 192 bytes it lost.

I believe my client is still processing the header message when the actual message arrives, so it loses that message. What's the best strategy for solving this problem? Here is the server code.
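In outline, the send side does something like the following (a simplified Java sketch of the protocol as described, not the exact listing; the header format and names are assumptions):

import java.io.IOException;
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class SubscriberSender {

    /** Sketch of the described protocol: a header announcing the size, then the payload. */
    public void send(Socket subscriber, byte[] payload) throws IOException {
        OutputStream out = subscriber.getOutputStream();

        // Header message telling the client how many bytes to expect next.
        byte[] header = ("SIZE:" + payload.length).getBytes(StandardCharsets.UTF_8);
        out.write(header);
        out.flush();

        // The payload follows immediately. Because nothing waits for the client,
        // a slow device can still be handling the header when these bytes arrive,
        // which is the situation described in steps 1-5 above.
        out.write(payload);
        out.flush();
    }
}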

 
Alvin Watkins
Ranch Hand
Posts: 53
Since no one offered any ideas, here's what I did...

I created a MessageManager object that takes all client messages and stores them in Maps. The MessageManager sends the client a header message only if MM.clientRecievedLastMessage is true. The client responds with the size of the message it is expecting (taken from that header), and the MessageManager then sends the message that corresponds to the header. Once the client has received all of the bytes, it notifies the MessageManager, which removes the delivered message and, if there are more unsent messages, sends the next header, and the process repeats.
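In rough outline, that approach looks like this (a sketch of the described flow, not the actual code; class, method, and field names are guesses, and the socket transport is stubbed out):

import java.util.Map;
import java.util.Queue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;

/** Sketch of the described MessageManager: a queue per client, one in-flight message at a time. */
public class MessageManager {
    private final Map<String, Queue<byte[]>> pending = new ConcurrentHashMap<>();
    private final Map<String, Boolean> clientReceivedLastMessage = new ConcurrentHashMap<>();

    /** Queue a message for a subscriber and send a header if nothing is in flight. */
    public void enqueue(String clientId, byte[] message) {
        pending.computeIfAbsent(clientId, id -> new ConcurrentLinkedQueue<>()).add(message);
        maybeSendHeader(clientId);
    }

    /** The client answers a header with the size it is expecting; send the matching body. */
    public void onClientRequestedMessage(String clientId, int expectedSize) {
        byte[] next = pending.get(clientId).peek();
        if (next != null && next.length == expectedSize) {
            sendToClient(clientId, next);
        }
    }

    /** The client confirms it has received every byte of the current message. */
    public void onClientAcknowledged(String clientId) {
        pending.get(clientId).poll();                    // drop the delivered message
        clientReceivedLastMessage.put(clientId, true);
        maybeSendHeader(clientId);                       // move on to the next one, if any
    }

    private void maybeSendHeader(String clientId) {
        Queue<byte[]> queue = pending.get(clientId);
        if (queue != null && !queue.isEmpty()
                && clientReceivedLastMessage.getOrDefault(clientId, true)) {
            clientReceivedLastMessage.put(clientId, false);
            sendHeaderToClient(clientId, queue.peek().length);   // header announces the size only
        }
    }

    // Transport details omitted; these would write to the subscriber's socket.
    private void sendHeaderToClient(String clientId, int size) { /* ... */ }
    private void sendToClient(String clientId, byte[] message) { /* ... */ }
}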

Thanks.
 