I have a server application and a client application. I'm using sockets to communicate between the server and client. Everything works fine if there's only one client connected: uploading, downloading all
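The usual cause of a server that only handles one client is a blocking accept-then-serve loop. A minimal sketch of the one-thread-per-client pattern in Java (the `EchoServer` class name and echo protocol are illustrative assumptions, not from the question):

```java
import java.io.*;
import java.net.*;

// One-thread-per-client echo server: the accept loop hands each new
// connection to its own thread, so a second client is never blocked
// while the first one is uploading or downloading.
public class EchoServer {
    private final ServerSocket server;

    public EchoServer(int port) throws IOException {
        server = new ServerSocket(port);
    }

    public int port() { return server.getLocalPort(); }

    public void start() {
        Thread acceptor = new Thread(() -> {
            try {
                while (true) {
                    Socket client = server.accept();          // blocks until a client connects
                    new Thread(() -> handle(client)).start(); // one thread per client
                }
            } catch (IOException ignored) { /* server socket closed */ }
        });
        acceptor.setDaemon(true);
        acceptor.start();
    }

    private void handle(Socket client) {
        try (BufferedReader in = new BufferedReader(new InputStreamReader(client.getInputStream()));
             PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
            String line;
            while ((line = in.readLine()) != null) out.println("echo: " + line);
        } catch (IOException ignored) { /* client disconnected */ }
    }

    public static void main(String[] args) throws IOException {
        EchoServer s = new EchoServer(0); // port 0 = pick any free port
        s.start();
        // Two clients connect in turn; each is served by its own thread.
        for (String msg : new String[]{"alice", "bob"}) {
            try (Socket c = new Socket("127.0.0.1", s.port());
                 PrintWriter out = new PrintWriter(c.getOutputStream(), true);
                 BufferedReader in = new BufferedReader(new InputStreamReader(c.getInputStream()))) {
                out.println(msg);
                System.out.println(in.readLine());
            }
        }
    }
}
```

For heavier client counts a thread pool (`ExecutorService`) or NIO selectors would replace the raw `new Thread(...)`, but the per-connection isolation is the key idea.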
I have a bit of a design question, if anyone out there cares to offer some advice (or point me in the right direction).
I have a DLL that runs the back-end and DB connections for a website that involves interactivity between individual sessions from separate clients - i.e. chat etc.
I have quite an advanced application, to which I need to add some client/server functionality. Some examples of the functionality are:
We have what I think is a fairly typical client/server architecture, with a frontend written in .NET, displaying data sent from a backend written in Java.
We have a multi-client system where the client is written in Flash and the server is written in Java. Currently, communication is done in Flash using flash.net.Socket and the proto
I'll generate images in real time on one computer. I want to stream this and have a few clients (on a LAN, or another high-speed connection) read this stream.
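Streaming one producer's frames to several readers is a fan-out problem: the server keeps a list of subscribers and pushes each generated frame to all of them. A minimal in-process sketch (the `FrameBroadcaster` name and callback-based subscribers are assumptions; over a LAN each subscriber would wrap a client socket's `OutputStream` instead):

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.function.Consumer;

// Fan-out sketch: the producer pushes every frame to each registered
// subscriber. CopyOnWriteArrayList lets clients join or leave while
// frames are being broadcast without extra locking.
public class FrameBroadcaster {
    private final List<Consumer<byte[]>> subscribers = new CopyOnWriteArrayList<>();

    public void subscribe(Consumer<byte[]> client) { subscribers.add(client); }

    public void broadcast(byte[] frame) {
        for (Consumer<byte[]> client : subscribers) client.accept(frame);
    }

    public static void main(String[] args) {
        FrameBroadcaster b = new FrameBroadcaster();
        StringBuilder a = new StringBuilder(), c = new StringBuilder();
        b.subscribe(frame -> a.append(frame.length).append(';'));
        b.subscribe(frame -> c.append(frame.length).append(';'));
        b.broadcast(new byte[1024]); // one "image" of 1 KiB
        b.broadcast(new byte[2048]);
        System.out.println(a + " " + c); // both clients saw both frames
    }
}
```

A slow subscriber would stall the loop in this naive form; real streaming servers usually give each client its own send queue, dropping or coalescing frames when a client falls behind.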
I'm making a client-server app in .NET, but I've run into a problem: when I send something, the client starts reading it, and while it is reading I send something else, which garbles the received data.
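This is the classic message-framing problem: TCP is a byte stream, so two `send` calls can arrive merged or split, and the reader has to be told where one message ends. The standard fix is a length prefix. A minimal sketch in Java (shown here with in-memory streams rather than the question's .NET sockets, so the framing logic is testable on its own):

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

// Length-prefixed framing: each message is written as a 4-byte
// big-endian length followed by the payload, so the reader always
// knows exactly how many bytes belong to the current message.
public class Framing {
    public static void writeFrame(DataOutputStream out, byte[] payload) throws IOException {
        out.writeInt(payload.length); // 4-byte length prefix
        out.write(payload);           // payload bytes
        out.flush();
    }

    public static byte[] readFrame(DataInputStream in) throws IOException {
        int len = in.readInt();       // read the length prefix
        byte[] buf = new byte[len];
        in.readFully(buf);            // block until the whole payload has arrived
        return buf;
    }

    public static void main(String[] args) throws IOException {
        // Round-trip two back-to-back messages through one byte stream,
        // simulating two sends arriving on the same socket.
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(sink);
        writeFrame(out, "hello".getBytes(StandardCharsets.UTF_8));
        writeFrame(out, "world".getBytes(StandardCharsets.UTF_8));

        DataInputStream in = new DataInputStream(new ByteArrayInputStream(sink.toByteArray()));
        System.out.println(new String(readFrame(in), StandardCharsets.UTF_8));
        System.out.println(new String(readFrame(in), StandardCharsets.UTF_8));
    }
}
```

The same pattern maps directly onto .NET's `BinaryWriter`/`BinaryReader` over a `NetworkStream`: write the length, write the bytes, and on the other side read exactly that many bytes before parsing.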
I have created an automatic application updater for one of my applications. It is a two-part application. The first piece is a version checker and update downloader. The second piece ins
My client connects to my server very well locally (in a LAN, through a router), but when I try to connect my client to a server on an IP that is not in my LAN, it doesn't work. What cou