Screen streaming

Posted By: EpsiloN

Screen streaming - 11/01/15 22:51

I've been searching for days, but all I can find is either too complicated for me, or it uses external software that works only as a VNC.

I want to stream my app to Debian and use a Raspberry Pi as a handheld "monitor" of my laptop, using sockets.

I've found (very few) examples that use D3D to access the backbuffer and write it to a file, but I have no idea if this is suitable for streaming (not sure if it can be turned back into an image on the client). I also have no idea how to get the device context and stream this info over a socket (there's only a D3D function to save the data as an image file, which is slower than sending that data directly).

I'm a newbie in D3D, but I'll get better soon.

Anything that can point me in the right direction will be appreciated laugh

Thanks.

PS.: I found this example by HeelX on grabbing the screen, and from what I can understand, "hBitmapdc" holds the content of the backbuffer for the desktop (let's assume I somehow point it towards the engine window grin ). How can I send it to a Python socket client (what kind of data is it?), and will Python understand what it is - I mean, is it a bmp? Sorry if this sounds newbish, but I have no idea what this buffer holds grin
I found this for Python (wx), which creates an image from a buffer object (suppose a socket put that data in that object) and displays it later as a streamed image in a window:
Quote:

CopyFromBuffer(self, data, format=BitmapBufferFormat_RGB, stride=-1)
Copy data from a buffer object to replace the bitmap pixel data.
Default format is plain RGB, but other formats are now supported as
well.




Here's the example from HeelX on grabbing the image:
Quote:

// Grabs the entire desktop. Make sure to set video_alpha to 0 before and to 100 after,
// to grab the desktop without the engine. Returns a BMAP* with the grabbed desktop.
//
BMAP* bmap_for_desktop ()
{
    BMAP* bmap = NULL;

    // get the device context of the entire desktop, including windows bar and everything
    HDC hDesktopDC = GetWindowDC(HWND_DESKTOP);
    if (hDesktopDC != 0)
    {
        // get desktop size in pixels
        int desktopSizeX = GetSystemMetrics(SM_CXSCREEN);
        int desktopSizeY = GetSystemMetrics(SM_CYSCREEN);

        // create a bitmap compatible with the desktop device context, into which we will
        // later blit the desktop content
        HBITMAP hBitmap = CreateCompatibleBitmap(hDesktopDC, desktopSizeX, desktopSizeY);
        if (hBitmap != NULL)
        {
            // create a memory device context compatible with the device context of the desktop.
            // This context will be used to select the blit target
            HDC hBitmapdc = CreateCompatibleDC(hDesktopDC);

            // select the target bitmap into the desktop-alike device context and copy the desktop
            SelectObject(hBitmapdc, hBitmap);
            BitBlt(hBitmapdc, 0, 0, desktopSizeX, desktopSizeY, hDesktopDC, 0, 0, SRCCOPY);

            // create a Gamestudio bitmap with the size of the desktop
            bmap = bmap_createblack(desktopSizeX, desktopSizeY, 888);
            if (bmap != NULL)
            {
                // lock bitmap to blit from the Windows bitmap to the Gamestudio bitmap
                var format = bmap_lock(bmap, 0);
                if (format > 0)
                {
                    // we blit the bitmap pixelwise by fetching and converting the RGB components from the
                    // Windows bitmap and throwing them with pixel_to_bmap into the Gamestudio bitmap. This
                    // procedure might be faster by using GetDIBits and bmap->finalbits, but this seems to be
                    // safer for now

                    int iRow, iCol;
                    COLORREF colorRef;
                    COLOR color;
                    var pixel;

                    for (iRow = 0; iRow < bmap->height; iRow++)
                    {
                        for (iCol = 0; iCol < bmap->width; iCol++)
                        {
                            // retrieve the RGB color value of the pixel at the specified coordinate as the
                            // hexadecimal value 0x00bbggrr. We use the standard GDI macros to extract the components
                            colorRef = GetPixel(hBitmapdc, iCol, iRow);

                            if (colorRef != CLR_INVALID)
                            {
                                color.red = GetRValue(colorRef);
                                color.green = GetGValue(colorRef);
                                color.blue = GetBValue(colorRef);
                            }
                            // else: the pixel is outside of the current clipping region

                            // convert and set the retrieved color to a pixel in the format of the Gamestudio bitmap
                            pixel = pixel_for_vec(&color, 100, format);
                            pixel_to_bmap(bmap, iCol, iRow, pixel);
                        }
                    }

                    // blitting done, unlock Gamestudio bitmap
                    bmap_unlock(bmap);
                }
            }

            // let's delete the desktop-alike device context and the GDI bitmap
            DeleteDC(hBitmapdc);
            DeleteObject(hBitmap);
        }

        // an application must not -delete- a DC whose handle was obtained by calling the GetDC
        // function. Instead, it must call the ReleaseDC function to free the DC. If such a DC is
        // not freed, serious effects on painting requested by other applications can happen!
        ReleaseDC(HWND_DESKTOP, hDesktopDC);
    }

    return(bmap);
}
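
The comment in the middle of that function already hints that GetDIBits would be faster than looping over GetPixel. As a rough, untested sketch (plain Win32 C, no Gamestudio types; the function name and details are just illustrative), the GetDIBits route could look like this:

Code:

// untested sketch: grab the desktop into one contiguous BGRA buffer
// with a single GetDIBits call instead of one GetPixel call per pixel
#include <windows.h>
#include <stdlib.h>
#include <string.h>

BYTE* grab_desktop_bgra(int* outWidth, int* outHeight)
{
    int w = GetSystemMetrics(SM_CXSCREEN);
    int h = GetSystemMetrics(SM_CYSCREEN);

    HDC hDesktopDC = GetWindowDC(HWND_DESKTOP);
    HDC hMemDC     = CreateCompatibleDC(hDesktopDC);
    HBITMAP hBmp   = CreateCompatibleBitmap(hDesktopDC, w, h);
    HGDIOBJ hOld   = SelectObject(hMemDC, hBmp);

    BitBlt(hMemDC, 0, 0, w, h, hDesktopDC, 0, 0, SRCCOPY);
    SelectObject(hMemDC, hOld);   // GetDIBits wants the bitmap deselected

    BITMAPINFO bi;
    memset(&bi, 0, sizeof(bi));
    bi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
    bi.bmiHeader.biWidth       = w;
    bi.bmiHeader.biHeight      = -h;   // negative height = top-down row order
    bi.bmiHeader.biPlanes      = 1;
    bi.bmiHeader.biBitCount    = 32;   // BGRA, 4 bytes per pixel
    bi.bmiHeader.biCompression = BI_RGB;

    BYTE* pixels = (BYTE*)malloc((size_t)w * h * 4);
    GetDIBits(hMemDC, hBmp, 0, h, pixels, &bi, DIB_RGB_COLORS);

    DeleteObject(hBmp);
    DeleteDC(hMemDC);
    ReleaseDC(HWND_DESKTOP, hDesktopDC);

    *outWidth  = w;
    *outHeight = h;
    return pixels;   // caller frees with free()
}

The negative biHeight asks for top-down rows, so the buffer comes out in normal scanline order; the caller would then convert/send the BGRA bytes and free() them.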
Posted By: Quad

Re: Screen streaming - 11/02/15 07:46

Any high-performance game streaming is done at the GPU driver level, and VNC is in no way suitable for game streaming.
Posted By: Anonymous

Re: Screen streaming - 11/02/15 08:37

<---Swimming over his head.
Could you pull the DX9 back buffer, send it as raw data, pull it back in, then feed it to the new machine's DC?
Quote:
- use IDirect3DDevice9::GetBackBuffer() to obtain the back buffer surface
- use IDirect3DSurface9::GetDC() to obtain the device context

EDIT- MORE fun with Google http://gamedev.stackexchange.com/questio...without-using-d
Yup, sorry. HBITMAP is a Windows handle to a bitmap object of some kind, and hBitmapdc seems to be the device context that bitmap is selected into.
Quote:
hBitmapDC: Handle to a device context of the image .......
- from Microsoft.

Google was fun, but yup - I have no real answers.
Posted By: EpsiloN

Re: Screen streaming - 11/02/15 10:05

That's what I'm after. Not a VNC but a custom app that'll stream my game exclusively.

I'll have to read more on device context and turning it into a regular bitmap, I guess.

I plan on using prediction for the stream instead of compression, and I'll try to use Google's WEBP for this... so it becomes a full stream (possibly more than 30 fps) over LAN.

Thanks for the heads up Malice laugh
Posted By: EpsiloN

Re: Screen streaming - 11/02/15 13:27

Ok, my quest continues laugh

It appears a bitmap is a bitmap; the only difference is the order the rows are stored in (top to bottom or bottom to top).

After more reading on Microsoft's docs, I found out that each pixel (as returned by GetPixel) is a COLORREF value: three 8-bit integers for the colour plus one unused high byte, i.e. the hexadecimal form 0x00bbggrr, which HeelX demonstrates how to disassemble into RGB values (0-255) with the Get* macros into a var.

I could pack that into a struct and send it through a socket to Python, where it could be assembled into a wxPython type and displayed on screen:
Quote:
wxBitmap is intended to be a wrapper of whatever is the native image format that is quickest/easiest to draw to a DC or to be the target of the drawing operations performed on a wxMemoryDC. By splitting the responsibilities between wxImage/wxBitmap like this then it's easier to use generic code shared by all platforms and image types for generic operations and platform specific code where performance or compatibility is needed.

Afaik, wxImage is data representing a platform-independent image. It contains a class RGBValue:
Quote:
class RGBValue
A simple class which stores red, green and blue values as 8 bit unsigned integers in the range of 0-255.

So, I now only have to test it somehow without having my Raspbian booted up grin, and somehow convert the data into a wxImage to be displayed by a DC.
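
For what it's worth, the sending side of that plan might be nothing more than a length-prefixed blob over TCP. A rough, untested Winsock sketch (send_all / send_rgb_frame and the variables are just made-up names for illustration):

Code:

// untested sketch of the sending side: push one raw RGB frame over a connected
// TCP socket. wx.Bitmap.CopyFromBuffer with BitmapBufferFormat_RGB expects
// width*height*3 bytes of plain RGB, so the receiver just needs the size first.
#include <winsock2.h>
#include <stdint.h>

int send_all(SOCKET sock, const char* buf, int len)
{
    int sent = 0;
    while (sent < len)
    {
        int n = send(sock, buf + sent, len - sent, 0);
        if (n <= 0) return 0;    // connection lost / error
        sent += n;
    }
    return 1;
}

int send_rgb_frame(SOCKET sock, const uint8_t* rgb, int width, int height)
{
    uint32_t header[2];
    header[0] = (uint32_t)width;
    header[1] = (uint32_t)height;

    if (!send_all(sock, (const char*)header, sizeof(header))) return 0;
    return send_all(sock, (const char*)rgb, width * height * 3);
}

On the Python side, reading the two 32-bit values and then width*height*3 bytes and handing them to CopyFromBuffer should be enough for a first proof of concept (the byte order of the header just has to match on both ends).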

If anyone has any ideas, feel free to express yourself grin
Posted By: WretchedSid

Re: Screen streaming - 11/02/15 15:32

I hate to be a buzzkill, but this task is way more complicated than you might think it is. For starters, you don't actually want to send bitmaps, as they are large and will clog up your network very, very fast. Instead, you probably want to live-encode it into a video and stream that, as video formats provide adaptive quality if needed, but can also cope with packet loss much, much better. For a quick and dirty start, anything keyframe-based will probably work, although H.264 is probably the best since there is plenty of documentation and code for it around.

That's not to say you don't still have to grab your screen - you do - but you will probably want to dump the result directly into something like ffmpeg to get the video encoding going, then send the resulting frames via a socket and assemble the video back together on the other end. And don't even bother converting the backbuffer to a BMAP or anything; you want raw access, and then do as little as possible with it to avoid the penalty of converting the result around. So ideally your backbuffer is already in a format that is directly supported by ffmpeg, and preferably you have a way to get it asynchronously without blocking the rendering (so probably triple-buffer everything or so).

The receiving side, on the other hand, is much simpler, although I'm not sure if the Raspberry Pi has enough power to deal with the video decoding. Probably not, but I think you can get some hardware decoding action there if you pay for a license. But yeah, pretty much you just wait for a keyframe to come along and then start assembling the video frame by frame.
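
For orientation only, the decoding loop on the receiving end could look roughly like this with libavcodec (the C API of the FFmpeg versions around that time; read_packet_from_socket is a made-up placeholder for however the packets arrive, and all error handling is left out):

Code:

// untested sketch of the receiving/decoding side with libavcodec
#include <libavcodec/avcodec.h>

void decode_loop(void)
{
    avcodec_register_all();

    AVCodec* codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    AVCodecContext* ctx = avcodec_alloc_context3(codec);
    avcodec_open2(ctx, codec, NULL);

    AVFrame* frame = av_frame_alloc();
    AVPacket pkt;

    for (;;)
    {
        av_init_packet(&pkt);
        if (!read_packet_from_socket(&pkt))   // placeholder, not a real function
            break;

        int got_frame = 0;
        avcodec_decode_video2(ctx, frame, &got_frame, &pkt);
        if (got_frame)
        {
            // frame->data[0..2] now hold the YUV planes -> convert/display here
        }
        av_free_packet(&pkt);
    }

    av_frame_free(&frame);
    avcodec_close(ctx);
    av_free(ctx);
}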
Posted By: EpsiloN

Re: Screen streaming - 11/02/15 18:23

I too gave that a thought and tried to do some calculations.

If I use the raw data in the buffer, it'll be around 1 MB per frame, so 60 MB/s at 800x420, 24-bit.

But I intend to compress it somehow, although I never thought about video encoding.
I don't want to convert the raw data of the backbuffer, because that will slow the process; I just want to compress it, send it, and have it decompressed back into raw data that the Raspberry can use as-is (device independent...).
That way, the Raspberry will only need to decompress, not also convert or resize or anything...

I was thinking something like JPEG and then the WEBP format, or directly using WEBP (I didn't see any working examples, but Google did explain how the process works, so I thought I could replicate it in the future, after I get a few frames going).

I'm not sure if the Raspberry could handle any kind of decompression at 60 fps, but I'll experiment and see if it fails.

I'll take a look at H.264 later today laugh I always download movies in that format, but never thought about using it myself grin hah, irony.
Posted By: WretchedSid

Re: Screen streaming - 11/02/15 18:32

You still need to encode and decode the JPEG, and that is quite costly. A JPEG really doesn't give you anything over a raw buffer in terms of how fast you can push it to the GPU to get it rendered, or how fast you can get it from the GPU to the socket, because in both cases you have to transform the image.

Really, use H.264 or any other video format that uses keyframes, as that will drastically improve quality over JPEG and decrease the bandwidth needed. Plus, your JPEG stream is not immune to packet loss, whereas H.264 is (although with reduced quality as a result).
Posted By: EpsiloN

Re: Screen streaming - 11/02/15 18:43

Don't worry, you've convinced me to use video encoding laugh I don't have a clue how it works, but I know it'll work well because it's a stream and pixel perfection isn't needed; image formats are better suited for a still image, right?

Ok, but what do you mean I don't want bitmaps, exactly? Sorry, but I'm new to the D3D stuff laugh
I can still safely use GetPixel(hdc,col,row) to access the raw data and pass it pixel by pixel to be encoded, or is there some sort of array that holds all this data?

PS.: And, another quick question, haven't asked Google yet, but how do I get the device context that Acknex uses? Didn't find any pointers in the manual... Sorry if this is a stupid question.
Posted By: WretchedSid

Re: Screen streaming - 11/03/15 10:56

Originally Posted By: EpsiloN
I know it'll work well because it's a stream and pixel perfection isn't needed; image formats are better suited for a still image, right?

That's right. However, one important thing is that JPEG, for example, is not pixel perfect but instead applies lossy (destructive) compression to the image, i.e. you end up with artifacts in the image that you can't get out.


Originally Posted By: EpsiloN
Ok, but what do you mean I don't want bitmaps, exactly? Sorry, but I'm new to the D3D stuff laugh
I can still safely use GetPixel(hdc,col,row) to access the raw data and pass it pixel by pixel to be encoded, or is there some sort of array that holds all this data?

Pixel by pixel is way too slow; you want to get a raw buffer of the data, preferably in a format that is already supported by ffmpeg or whatever encoding library you end up using. The fewer operations there are between grabbing the data and getting it to the encoder, the better, so ideally you do no work at all here and just copy the data into a suitable buffer (or even better, get a suitable buffer provided by whatever!) and then send that off to the encoder.

Originally Posted By: EpsiloN
PS.: And, another quick question, haven't asked Google yet, but how do I get the device context that Acknex uses? Didn't find any pointers in the manual... Sorry if this is a stupid question.

No freaking idea. Not a stupid question at all, but I'm not the right one to answer it. Someone else will probably be able to help you out there.
Posted By: EpsiloN

Re: Screen streaming - 11/03/15 12:24

I have no idea what I'm supposed to do now, but I can't give up in the middle laugh

Two days of searching on H.264 only gave me more confusion. I tried to find a library that would work something like encode(buffer) / decode(buffer) laugh but I guess if it were that easy I wouldn't have to look for it...

I finally found a reference to this ffmpeg tutorial:
http://dranger.com/ffmpeg/tutorial01.html
I'll try to read it and follow it closely to see if it can take an array of COLORREF dwords (assuming here that the buffer is one big array; if not, I found a snippet by someone who says his way is faster than the GetPixel macro, here:
http://stackoverflow.com/questions/10515646/get-pixel-color-fastest-way
and I'll build an array myself to pass as data to be encoded).

PS.: Each time I look at something, your words come to my mind: "Know your data types" grin I know I should read a DX tutorial or two before going any further...

And, by the way, just for reference, I did find something that streams the buffer, but I have no idea how it works (couldn't dissect it...), and it's in C#, here:
https://github.com/fishjord/D3DFrameStreamer/tree/master/AVIStreamCLI
It also uses injection, but I guess that can be stripped out somehow if anyone wants to stream his own app...
Posted By: WretchedSid

Re: Screen streaming - 11/03/15 16:25

ffmpeg is just the overall project; probably most interesting to you is libavcodec, which implements the encoders and decoders. Here is an example: https://www.ffmpeg.org/doxygen/0.6/api-example_8c-source.html

And there is more to find on the internet.
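
To give a feel for what that example boils down to, here is a rough, untested sketch of opening an H.264 encoder and pushing one frame at a time (libavcodec API as it was in those FFmpeg versions; error handling and rate-control settings are omitted):

Code:

// untested sketch: open an H.264 encoder once, then feed it one YUV420P frame
// per captured screen; pkt.data/pkt.size is what goes to the file or socket
#include <stdint.h>
#include <libavcodec/avcodec.h>

static AVCodecContext* ctx;
static AVFrame* frame;

void encoder_open(int width, int height, int fps)
{
    avcodec_register_all();

    AVCodec* codec = avcodec_find_encoder(AV_CODEC_ID_H264);
    ctx = avcodec_alloc_context3(codec);
    ctx->width         = width;
    ctx->height        = height;
    ctx->time_base.num = 1;
    ctx->time_base.den = fps;
    ctx->pix_fmt       = AV_PIX_FMT_YUV420P;
    ctx->gop_size      = fps;              // roughly one keyframe per second
    avcodec_open2(ctx, codec, NULL);

    frame = av_frame_alloc();
    frame->format = ctx->pix_fmt;
    frame->width  = ctx->width;
    frame->height = ctx->height;
    av_frame_get_buffer(frame, 32);        // allocates the three YUV planes
}

// fill frame->data[0..2] with the converted screen grab, then call this:
void encoder_push(int64_t pts)
{
    AVPacket pkt;
    int got_packet = 0;

    av_init_packet(&pkt);
    pkt.data = NULL;
    pkt.size = 0;

    frame->pts = pts;
    avcodec_encode_video2(ctx, &pkt, frame, &got_packet);
    if (got_packet)
    {
        // pkt.data / pkt.size is the compressed frame -> write or send it here
        av_free_packet(&pkt);
    }
}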
Posted By: Anonymous

Re: Screen streaming - 11/03/15 17:56

In my link run the other day, I found people locking the D3D backbuffer and streaming it via H.264, noting the locking to be a bottleneck and also noting that there is no fast way in DX9 to grab the front buffer.

I can try and grab more of those links with examples if you like. However, I don't want to add more confusion and a different path to your already deep woods.

Quote:
PS.: And, another quick question, haven't asked google yet, but how do I get the device context that the Acknex uses? Didn't find any pointers in the manual... Sorry if this is a stupid question.

Manual pages of note:
http://www.conitec.net/beta/adraw_begin.htm
http://www.conitec.net/beta/ad3d_lockable.htm
http://www.conitec.net/beta/prog_using_Direct3D.htm
http://www.conitec.net/beta/bmap_zbuffer.htm
http://www.conitec.net/beta/render_zbuffer.htm
Quote:
The LPDIRECT3DSURFACE9 pointer to the created z buffer is available through the render_zbuffer variable.

LPDIRECT3DSURFACE9 and IDIRECT3DSURFACE9 seem analogous (the LP type is just a pointer to the interface). So I'm not sure if this is:
- use LPDIRECT3DDEVICE9->GetBackBuffer() to obtain the back buffer surface
- use LPDIRECT3DSURFACE9->GetDC() to obtain the device context.

Non-DX9:
https://msdn.microsoft.com/en-us/library/windows/desktop/dd144947(v=vs.85).aspx
http://www.conitec.net/beta/hWnd.htm

EDIT2 - just thinking: is it possible to create a bmap_zbuffer, render the screen to it, then stream that?
Posted By: EpsiloN

Re: Screen streaming - 11/03/15 19:31

Yeah, after my last post I saw that ffmpeg is the encoder that can encode with different codecs (someone was using it to encode H.264) laugh

Malice, thanks for the info, I'll check the links later, because I woke up extremely ill today and I might be afk for a few days. But as for the locking, I saw some people using it in some code, but never looked too closely. I also saw from posts that grabbing the front buffer is a waste of time, literally grin

Here's what I've currently outlined as a method (purely theoretical, haven't written a single line of code yet laugh ):
Get the device context
Create myDC
CreateDIBSection
Select the DIB section object into myDC
each frame:
- BitBlt the content of the DC into the DIB section (now I have an array in the order [r][g][b][r][g][b][r][g][b]...)
(Note - this is said to be faster than GetPixel)
** I've reached this point in reading, from now on everything is just an outline of thoughts laugh **
- Create a buffer sized to hold a matching-resolution YCbCr array and use a conversion function to convert the RGB array into YCbCr (which I read somewhere is kind of slow, but maybe there are different implementations - see the sketch below this list)
- Create an AVPicture and fill it with the YCbCr array (avpicture_fill)
- Copy the AVPicture to a fresh AVFrame and pass that frame to the encoder
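
For the conversion step, libswscale can do the RGB-to-YCbCr work and write straight into the encoder's AVFrame planes, which would replace the separate YCbCr buffer and the avpicture_fill step. A rough, untested sketch (assuming BGRA input rows, as BitBlt into a 32-bit DIB section delivers; names are just illustrative):

Code:

// untested sketch: convert one captured BGRA frame into the YUV420P planes
// of the AVFrame that gets handed to the encoder
#include <stdint.h>
#include <libswscale/swscale.h>
#include <libavutil/frame.h>

static struct SwsContext* sws = NULL;

void convert_bgra_to_yuv(const uint8_t* bgra, int width, int height, AVFrame* frame)
{
    const uint8_t* src[1];
    int src_stride[1];

    if (!sws)
        sws = sws_getContext(width, height, AV_PIX_FMT_BGRA,
                             width, height, AV_PIX_FMT_YUV420P,
                             SWS_BILINEAR, NULL, NULL, NULL);

    src[0]        = bgra;
    src_stride[0] = width * 4;   // 4 bytes per BGRA pixel

    // writes directly into frame->data[0..2] / frame->linesize[0..2]
    sws_scale(sws, src, src_stride, 0, height, frame->data, frame->linesize);
}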

I hope I'm right so far, but it'll all show up once I begin testing in a few days, when I get better.
I'll first start by creating a video file on my HDD; if that succeeds, I'll try to send the data and create a file on the Raspberry, avoiding showing the image for now and just playing the result on my laptop (I'll worry about decoding it on the raspy after I get it sending and receiving correctly laugh )

Thanks for the help so far, guys. I'm glad you're still lingering here...

PS.: Malice, you're suggesting creating a bmap_zbuffer, but isn't bitmap creation slow? I also don't have a clue how render targets work. I used them many years ago, and I have no memory of that process.
Posted By: Anonymous

Re: Screen streaming - 11/03/15 20:17

Quote:
PS.: Malice, you're suggesting creating a bmap_zbuffer, but isn't bitmap creation slow? I also don't have a clue how render targets work. I used them many years ago, and I have no memory of that process.


Truly, I'm just throwing out ideas. But if I understand correctly, render_zbuffer is a Direct3D surface with a pointer to the D3D object.

As for using this kind of idea, txesmi did a lot of the work in a thread in this very forum a week or so ago. So stripping the code he posted down to what you need 'could' be half your work. LINK HERE

As to speed, or whether anything I say will work - I'm just firing off ideas. I have no clue. I'll try to do some testing; however, I'll likely fail, as I'm miles behind you in understanding all this.

Mal

EDIT - This didn't crash; however, examining d3d9.h and trying to translate this http://gamedev.stackexchange.com/questio...without-using-d
makes my mind explode... I want to help, but I am way over my head here. I have no idea what is happening at GetDC(), and it might melt your PC. This stuff is too high-level for me to just trial-and-error through. Sorry
Mal
Quote:

#include <acknex.h>
#include <d3d9.h>

void main()
{
    .............................
    BMAP *bmpMap = bmap_createblack(1920, 1200, 32);
    BMAP *bmpScreen = bmap_createblack(screen_size.x, screen_size.y, 24);
    bmap_zbuffer(bmap_createblack(2048, 2048, 32));

    while (1)
    {
        bmap_rendertarget(bmpMap, 0, 0);
        GetDC(render_zbuffer); // note: this is the Win32 GetDC(HWND); the surface method would be render_zbuffer->GetDC(&hdc)
        bmap_rendertarget(NULL, 0, 0);
        wait(1);
    }
}
Posted By: EpsiloN

Re: Screen streaming - 11/03/15 23:03

Lol, wrote a really long boring post grin

Gotta go to bed, so in short: a DC is a structure that contains other stuff, like the bitmap data being displayed in our Acknex window. GetDC works on a handle (a Windows handle) to the window and returns a handle to the device context (a structure of data about our window). Then we know the space in memory that holds what is being drawn in our window... With some core functions we can get access to the bits by giving them the device context handle... from what I understood...

By the way, someone reported the BitBlt function (an image copy in memory) took more than 30 ms each time, so 60 fps won't cut it. Another guy mentions LockBits being faster and giving manual control, but I have to see how I can use it on the buffer (he's using a Bitmap structure as an example...)

Gotta go, really laugh
Posted By: WretchedSid

Re: Screen streaming - 11/04/15 19:26

The z-buffer is the buffer with the depth values in it; you probably actually want to grab the colour buffer! Just an FYI, because I really can't help you with actually grabbing the DirectX pointers and what to do with them.
Posted By: Ch40zzC0d3r

Re: Screen streaming - 11/04/15 19:52

Code:
IDirect3DSurface9 *pSurface;
LPD3DXBUFFER pBuffer;

// GetFrontBufferData requires an A8R8G8B8 offscreen plain surface in the
// system memory pool (D3DPOOL_SYSTEMMEM, not D3DPOOL_SCRATCH)
pDevice->CreateOffscreenPlainSurface(screen_size.x, screen_size.y, D3DFMT_A8R8G8B8, D3DPOOL_SYSTEMMEM, &pSurface, NULL);
pDevice->GetFrontBufferData(0, pSurface);
D3DXSaveSurfaceToFileInMemory(&pBuffer, D3DXIFF_JPG, pSurface, NULL, NULL);
pSurface->Release();

// Now have fun with:
// pBuffer->GetBufferSize();
// pBuffer->GetBufferPointer();



And then do all your compression stuff. (It's saved as JPG; you might use BMP/PNG if you want to use another compression.)
This is still slower than coding your own device driver like TeamViewer did.
Posted By: EpsiloN

Re: Screen streaming - 11/05/15 13:33

Ch40zzC0d3r, everyone says that getting the front buffer is slow. I'm browsing for other methods.

Searching again on Google gave another result, direct quote:
Quote:

I'm trying several methods to capture the screen with DirectX and I've come up with 3 methods so far:

• GetFrontBufferData() - average execution times:
    • GetFrontBufferData(): 0.83598 s
    • D3DXSaveSurfaceToFile(): 0.0036 s
    • Total: 0.83958 s

• GetBackBuffer() - average execution times:
    • GetBackBuffer(): 0 s <-- INTERESTING - WHY?
    • D3DXSaveSurfaceToFile(): 0.2918 s
    • Total: 0.2918 s

• GetRenderTargetData() - average execution times:
    • GetRenderTargetData(): 0.00928 s
    • D3DXSaveSurfaceToFile(): 0.00354 s
    • Total: 0.01282 s

Average times have been computed by taking 50 screenshots and measuring the time with clock().

A quick glimpse at GetRenderTargetData - "Copies render-target data from device memory to system memory."

I have to read more on what a "surface" is grin
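
For reference, the GetRenderTargetData route from that quote would look roughly like this in DX9. This is an untested sketch in the same style as Ch40zzC0d3r's snippet earlier (pDevice, width and height are assumed to exist already):

Code:

IDirect3DSurface9* pRenderTarget = NULL;
IDirect3DSurface9* pSysmemSurface = NULL;

// grab the current render target and create a system-memory surface to copy it into
pDevice->GetRenderTarget(0, &pRenderTarget);
pDevice->CreateOffscreenPlainSurface(width, height, D3DFMT_A8R8G8B8,
                                     D3DPOOL_SYSTEMMEM, &pSysmemSurface, NULL);

// GPU -> system memory copy (the ~0.009 s step from the quote)
pDevice->GetRenderTargetData(pRenderTarget, pSysmemSurface);

// lock to get at the raw BGRA bytes
D3DLOCKED_RECT lr;
pSysmemSurface->LockRect(&lr, NULL, D3DLOCK_READONLY);
// lr.pBits points at the pixels, lr.Pitch is the byte width of one row;
// copy/convert/encode from here
pSysmemSurface->UnlockRect();

pSysmemSurface->Release();
pRenderTarget->Release();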

Another note: I found some interesting examples on encoding/decoding, but they output to a file stream, so I'll have to learn how to make this data go through a socket. I guess it won't be that much different...
Here's the examples, if anyone is interested in making a video player laugh :
https://ffmpeg.org/doxygen/trunk/encoding-example_8c-source.html
Posted By: Ch40zzC0d3r

Re: Screen streaming - 11/05/15 19:31

Well, where is the problem now?
I just googled for 2 seconds to throw in some code; I know it's not the fastest.
Just use GetRenderTargetData and you're done.
Posted By: EpsiloN

Re: Screen streaming - 11/05/15 22:22

Oh, I got past that point (accessing the screen). I'm now on the 'send 160 Mbps over LAN' problem. laugh I'm trying to learn how to encode/decode the data, and I think I got it today, so tomorrow is when I'm going to start coding. I was a little busy (ill, and I got a new car...).

Also, I remember someone mentioning that the conversion from RGB to YCbCr is usually slow, so that might be a problem too, but I'll report after the first tests.

I just hope everything works out as in theory tomorrow, if my day job doesn't get in the way...

Until then, any guidelines for sending data from a buffer from Lite-C to Python on Debian?

Thank you all, again. I'm extremely excited about this project.
Posted By: Ch40zzC0d3r

Re: Screen streaming - 11/05/15 22:54

Just use Winsock, and don't use TCP.
Simply use UDP and compress the pics as JPEG, and you will have some nice speed for a start.
If your protocol is stable enough, you can try playing with multithreading and with different libraries/methods to shrink your byte array as much as possible. Do it like Twitch and add a delay, which is usually there to compensate for compression time.
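
A bare-bones, untested Winsock/UDP sketch of that idea (a real protocol would also want a frame id and chunk index in every datagram so the receiver can reassemble frames and drop incomplete ones; the names are just illustrative):

Code:

// untested sketch: open a UDP socket and send one compressed frame in MTU-sized chunks
#include <winsock2.h>
#include <string.h>

#define CHUNK_SIZE 1400   // stay below the typical Ethernet MTU

SOCKET udp_open(const char* ip, unsigned short port, struct sockaddr_in* dest)
{
    WSADATA wsa;
    SOCKET s;

    WSAStartup(MAKEWORD(2, 2), &wsa);
    s = socket(AF_INET, SOCK_DGRAM, IPPROTO_UDP);

    memset(dest, 0, sizeof(*dest));
    dest->sin_family      = AF_INET;
    dest->sin_port        = htons(port);
    dest->sin_addr.s_addr = inet_addr(ip);
    return s;
}

void udp_send_frame(SOCKET s, struct sockaddr_in* dest, const char* data, int size)
{
    int offset = 0;
    while (offset < size)
    {
        int chunk = size - offset;
        if (chunk > CHUNK_SIZE) chunk = CHUNK_SIZE;
        sendto(s, data + offset, chunk, 0, (struct sockaddr*)dest, sizeof(*dest));
        offset += chunk;
    }
}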

How many FPS do you want to stream btw?
Posted By: EpsiloN

Re: Screen streaming - 11/08/15 08:56

Well, I'd prefer something over 30, because it looks laggy to me when the engine drops to 30. On the other hand, if it ends up looking like a movie, that's fine - from what I know movies play at 25 fps, but they still look smooth...

I'm extremely ill, so I can't think straight, but I tried a couple of times and the only method that worked right off the shelf was some code from Microsoft that captures a screenshot into a bmp file. (And by the way, capturing 600 of those, each time into the same bmp file, didn't drop my frame rate at all - but my level is extremely simple, without any shaders or effects laugh )

Before going any further, I'll try making a socket connection to the Raspberry and displaying a single frame every second (the conversion is bothering me, although it should be in the same format, i.e. bits, from my laptop's backbuffer to a DC in wxPython on the Raspberry). After I have something going, I'll try compressing and sending more fps. laugh
Posted By: EpsiloN

Re: Screen streaming - 11/11/15 12:28

In need of help again laugh

I managed to get a socket connection between Acknex and a Python script. UDP didn't work, for some reason...

I successfully ran an example from some Microsoft page that captures the backbuffer into a bmp file at 60 fps with no fps drop... I added the structs for BITMAP, HBITMAP, BITMAPINFO and a lot of other declarations to get past the errors I was getting, and it worked.

Now the problem...
I tried using the FFmpeg library and a JPEG library (with examples I found), but it's failing to compile...
After hundreds of missing-file errors I downloaded libstdc++-v3 and the standard C library and tried adding files from wherever I found them (multiple different files with the same name existed in different folders), but I started getting this error:
Quote:
...
Error in 'features.h' line 173:
< #if (!defined __STRICT_ANSI__ && !defined _ISOC99_SOURCE && \
>
.. 0.436 sec
Error compiling...

That's after I deleted a whole statement like this one, and after removing the spaces between # and the ifdef statements...

I feel like going mad! :|

I understand those libraries come by default with IDEs, so it's normal not to have them with our SED, right? How can I add them, or something?

Thanks.
Posted By: WretchedSid

Re: Screen streaming - 11/12/15 20:41

Originally Posted By: EpsiloN
I understand those libraries come by default with IDEs, so it's normal not to have them with our SED, right? How can I add them, or something?


IDEs very rarely ship with anything more than a handful of default and compiler/platform dependent libraries. I'm not aware of an IDE shipping with ffmpeg.

You won't get ffmpeg to compile with the Lite-C compiler, because it simply is not capable of that. You'll need a real compiler to get that going, so a plugin is what you will have to go for.
Posted By: EpsiloN

Re: Screen streaming - 11/13/15 08:08

Thought so laugh

I already tried to compile the screenshot function I got from Microsoft into a DLL, but I ran into a problem (there's no wait(1) in C++). And I don't understand the syntax that well, so I'll be reading for a few days.

I'll try to make three functions - one for locking, one for copying the bits, and one for unlocking - and I'll call the copy function every frame.
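
Roughly, the DLL side could be as simple as three exported functions like these. This is an untested sketch for a plain C compiler; the names and signatures are just made up for illustration (this is not the Acknex plugin SDK), and the actual capture/encode work would go into CopyFrame:

Code:

// untested sketch: a plain Win32 DLL exposing lock / copy / unlock functions
#include <windows.h>
#include <stdlib.h>

static void* g_frameBuffer = NULL;
static int   g_width  = 0;
static int   g_height = 0;

__declspec(dllexport) int LockCapture(int width, int height)
{
    g_width  = width;
    g_height = height;
    g_frameBuffer = malloc((size_t)width * height * 4);   // BGRA
    return (g_frameBuffer != NULL);
}

__declspec(dllexport) int CopyFrame(void)
{
    // called once per engine frame: grab the back buffer / render target
    // into g_frameBuffer and hand it to the encoder here
    return 1;
}

__declspec(dllexport) void UnlockCapture(void)
{
    free(g_frameBuffer);
    g_frameBuffer = NULL;
}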

I'll post when I have any results...
Thanks.
© 2024 lite-C Forums