
Discussions

androidi
  • What's the dwm API equivalent to audio loopback capture?

    I gave this some thought. While there's certainly also a compression issue, the bigger factors are likely:

    1) Maybe my ISP's YouTube CDN doesn't have the full-bitrate content on it yet.

    2) Maybe some "producers" are doing shoddy captures of already-compressed content (perhaps of new content that was not yet at full quality on the CDN), then recompressing it; YouTube then applies its own compression and whatnot, then another producer captures that, and so on.

    3) I'm viewing a fair amount of very recently uploaded videos (within a day or a week). YouTube gets so many uploads that there could be prioritization going on, such that certain partners' content gets encoded at high quality at the various resolutions sooner, especially when the source bitrate is high. And since I watch just-uploaded material from random channels, I'm getting the hardest-to-predict and thus worst case.

  • What's the dwm API equivalent to audio loopback capture?

    I don't think that would be fair. If you wanted to, say, compare various services and their players, the analysis would need to use the output from the player (Flash/HTML5).

    YouTube is now bragging about 8K video resolution support, but anyone with at least one eye can see that the compression is so heavy-handed that there are massive artifacts, which the high resolution just makes even more obvious.

    As for the analysis, I just figured it should be enough to apply some form of lossless compression to the frames and see how large the result is. If the video is some sort of slow panning view, the lossless compression should reveal the actual amount of data in each frame (a minimal sketch of this is below). Pundits in YouTube comments already argue that Vimeo's Full HD looks better than YouTube's 8K. With that kind of analysis, this YouTube "bigger res is better" race would get its appropriate conclusion. It would be funny to prove that a bigger resolution isn't better when there's heavy compression involved, or just a plain lack of production value in the source.

    The same approach could be used to analyze games and drivers, which seem to commonly "optimize" the graphics into awfulness: to restore any resemblance of the original quality you now have to render at 2-4x resolution, because rendering at native resolution has been subtly optimized to look worse over the years. If it hadn't been, there should not be such a massive difference between rendering at 2-4x and downsampling versus rendering at native resolution.
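
    Something like this minimal C# sketch could serve as a first cut, assuming the frames are already available as System.Drawing bitmaps (FrameDetail and PngRatio are made-up names, and the PNG size ratio is only a crude stand-in for the real amount of detail):

        using System;
        using System.Drawing;
        using System.Drawing.Imaging;
        using System.IO;

        static class FrameDetail
        {
            // Crude "amount of data in the frame" metric: losslessly compress
            // the frame as PNG and compare the result to the raw 24bpp size.
            // Blurry or flat frames compress much better than detailed ones.
            public static double PngRatio(Bitmap frame)
            {
                long rawBytes = (long)frame.Width * frame.Height * 3;
                using (var ms = new MemoryStream())
                {
                    frame.Save(ms, ImageFormat.Png);   // PNG is lossless
                    return (double)ms.Length / rawBytes;
                }
            }
        }

    A slow panning shot with real detail should hold a fairly steady ratio from frame to frame, while the "blurry mush" segments should show the ratio collapsing whenever there is movement.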

     

  • What's the dwm API equivalent to audio loopback capture?

    I'm noticing that YouTube videos, when movement is present, appear to have maybe 25% of the detail they have when there is no movement. Not all of them, but many recently uploaded ones. I'm not sure of the cause. The result is that many recent videos look like blurry mush.

    But in order to quantify this problem, I need to be able to select a portion of the desktop or the whole screen and then use a DWM equivalent of "WASAPI audio loopback capture" to grab lossless frames from the YouTube window and analyze the amount of detail in each frame, such that I could then overlay statistics on top of the HTML5 or Flash video showing how much detail there is.

    Because I think that while YouTube claims the video is 1080p or whatever, in a lot of recently uploaded videos the actual detail is less than 500K pixels' worth. I'm also pretty certain that some old videos have degraded in quality - the ones that were compressed with some first version of the Flash codec.

    Further, I've found several reports that some very old Flash versions give better video quality on YouTube than the latest version. (This was a year or two ago; I couldn't get a good repro when I tried downgrading.)
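
    In the meantime, a plain GDI screen copy isn't the DWM-level lossless hook I'm after (something at the DXGI desktop duplication level would presumably be closer), but as a crude stand-in a sketch like this could feed a region of the desktop into the detail metric above (RegionGrabber is a made-up name and the polling interval is a guess, not frame-accurate):

        using System;
        using System.Drawing;
        using System.Threading;

        static class RegionGrabber
        {
            // Polls a rectangle of the composited desktop into bitmaps.
            // This only sees what DWM has already composed, so treat the
            // result as an approximation of the video surface.
            public static void Grab(Rectangle region, int frameCount, Action<Bitmap> onFrame)
            {
                for (int i = 0; i < frameCount; i++)
                {
                    var bmp = new Bitmap(region.Width, region.Height);
                    using (var g = Graphics.FromImage(bmp))
                    {
                        g.CopyFromScreen(region.Location, Point.Empty, region.Size);
                    }
                    onFrame(bmp);        // caller analyzes and disposes the frame
                    Thread.Sleep(40);    // ~25 captures per second at best
                }
            }
        }

    For example: RegionGrabber.Grab(new Rectangle(100, 100, 1280, 720), 250, f => { Console.WriteLine(FrameDetail.PngRatio(f)); f.Dispose(); });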

     

     

  • There is no escape from the suck-train!

    http://windows.microsoft.com/en-gb/windows/preview-faq-system-requirements-pc

    • Processor: 1 gigahertz (GHz) or faster

    • RAM: 1 gigabyte (GB) (32-bit) or 2 GB (64-bit)

     

    That's sort of interesting. There's no mention of "64-bit only". I found that some people were hacking Windows 8 to get it to install on older 32-bit CPUs like the early Pentium Ms. I wonder whether 10 now works on those old systems where 8 won't, or whether these system requirements just aren't specific enough.

    Until there are some serious alternatives to old systems with a decent aspect ratio, I'll try to avoid buying anything if I can. I'm only looking at the SP4 14" (assuming it's still 3:2) and maybe the XPS 18, if I could somehow get my hands on one, as I don't like buying displays without seeing them first.

  • Does the new IE in Win 10 still 'leak'?

    Well, I took a bit of time to learn to use the F12 Memory tab in IE11. It indicates there is some sort of loop going on in Google's JS code doing allocations, and this code is loaded on various sites. (I didn't check whether the problem is in the Google code or whether the sites are calling it in a loop - but two different sites show the exact same issue.)

     


     

    https://apis.google.com/_/scs/apps-static/_/js/k=oz.gapi.en.ySt90RvrVF8.O/m=auth/exm=plusone/rt=j/sv=1/d=1/ed=1/am=IQ/rs=.....

    I don't know if this rs=... parameter is some tracking info or what, so I blanked it. It appears to be the same thing on both sites.

  • Does the new IE in Win 10 still 'leak'?

    I haven't quite pinpointed the page, because I tend to have different pages open in different tabs and I have no idea how to identify which tab is causing the growth. I have some candidates, though, so as it happens again and again I can slowly narrow it down. They all have something to do with shopping, so no doubt there's all sorts of JS on them.

    I'll leave some of the sites open in separate processes to see if it happens again...

  • Does the new IE in Win 10 still 'leak'?

    This is after having had a bundle of IE11 9600.17691 tabs open, sitting on the same* page for ~20 hours. As usual, there are some things in my configuration that may trigger it. So I'll just get 16 GB of memory from now on, so I can keep the tabs open for at least... 40 hours.

    900 MB IE11 instance

    * By "same" I mean having the *same* page open without activity; the URL/site may be different. A look at the loaded DLLs in Process Explorer shows that 2 of the 5 biggest leakers have Flash, and another 2 only have Microsoft DLLs besides the Nvidia user-mode driver. I think I saw this issue on a computer without Nvidia, so I'm pretty confident the issue is in IE. (A small sketch for logging this per process is below.)
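
    Something like the following sketch could keep an eye on this without staring at Process Explorer: it logs each IE tab process by memory use and flags the ones with Flash loaded. It won't say which URL lives in which process (narrowing that down still needs the separate-session approach above), and module enumeration can fail across 32/64-bit boundaries, hence the try/catch:

        using System;
        using System.Diagnostics;
        using System.Linq;

        static class TabMemory
        {
            // Lists IE tab processes, biggest working set first, and marks
            // the ones that have a Flash module loaded.
            public static void Dump()
            {
                foreach (var proc in Process.GetProcessesByName("iexplore")
                                            .OrderByDescending(p => p.WorkingSet64))
                {
                    bool hasFlash = false;
                    try
                    {
                        hasFlash = proc.Modules.Cast<ProcessModule>()
                                       .Any(m => m.ModuleName.StartsWith("Flash",
                                            StringComparison.OrdinalIgnoreCase));
                    }
                    catch (Exception) { /* access denied or bitness mismatch */ }

                    Console.WriteLine("PID " + proc.Id + ": "
                        + (proc.WorkingSet64 / (1024 * 1024)) + " MB"
                        + (hasFlash ? " (Flash loaded)" : ""));
                }
            }
        }

    Running that on a schedule while the tabs sit idle would show which processes keep growing and whether the growth correlates with Flash being loaded.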

  • Before Surface Pro 4

    Besides the reasons mentioned above, this could be another one. This is "funny":

    http://rafesagarin.com/2014/06/18/why-dell-sucks/

  • Before Surface Pro 4

    While waiting for the Surface 4 14", I just found out about Dell's XPS 18". Kind of interesting. At that size even a wide-aspect-ratio screen is usable, though being so wide it's not so portable anymore, and the excess width adds weight too. There's also the charging issue: as long as USB 3.1 isn't widespread, charging pretty much means having to haul the charger around, which pretty much removes the whole portability aspect. I don't get why desktop PCs weren't standardized with 12 V external power ports. (On my desktops I have added both Molex and SATA power ports in the PCI backplates so I can use internal storage externally without the high overhead of USB.) If they were, you could just walk up to any desktop PC and charge your laptop from its power port.

    Come to think of it, the desktop ATX power supply standard needs to be changed to require a SATA power port next to the power supply input connector...

     

    I'd be tempted to go look at the Dell, except that Dell has made it hard to buy from them (poor availability + 70% higher prices in the countries where Dell is probably more big-corp/government focused).

  • VS vNextNext : Variable snapshot and "Edit and Commit"

    @bondsbw: That's something I've wanted on another project: the ability to add new trace stuff at runtime, to "add logging statements on the fly". If that really works without pausing the app, it could also be useful together with the "edit and commit".

    ---

    With the "edit and commit" - editing without pausing the app - it's useful for getting a coarse idea of what is going on in the variables. This is really no different than debugging some sort of analog circuit with a digital scope: the circuit (or app) gets into a state where it's doing stuff, and you grab a snapshot of the data (poke your probe around the board with single-shot triggering on the scope). The easiest example to understand would be a half-broken piece of audio synthesizer hardware. You would not power it off to fix or tweak the sound; you would instead add the ability to make adjustments (like tiny internal trim pots) while it's running*. And by "broken" I mean that perhaps the audio did not sound nice, or maybe there was a repetitive glitch only after the device got into a certain state, etc.

    Now, I'm willing to leave it open how this runtime editability should be implemented. So far I've suggested you might just have the ability in something like the CLR, but another way to do it is to have a C# console app template that creates all the boilerplate needed to have one app running and another editable; then you can "fail over" the running app to the edited version, and the two could even be located on different computers (a rough sketch of this failover pattern is at the end of this post). I imagine this would be somewhat similar to cloud development, but without needing the cloud. Because let's face it: if I'm doing a game engine or a driver or whatever, do I want the cloud? Not really... But I still need the ability to tweak things at runtime, and I may also need the ability to fail things over such that the production executable is on embedded hardware or a simulator and the 'being edited' version is on the VS side - then I just fail the stuff on the hardware/simulator over to the edited version, so that instead of 1-second cross-platform compile times it becomes a <1 millisecond operation.

    So there are a few ways to architect this, and once you have it, you also want the ability to monitor the variables at runtime - just as if you had an n-channel logic analyzer sitting on all the buses on the hardware, waiting for a condition to occur so you can get a short view of what went on just before and after the condition/glitch.

     

    (* With hardware you can actually pull stuff out of and add stuff into the circuit without powering it off, if you understand what kinds of changes can be made without risking component failure - and of course the more failure-prone things can either be designed for that or dealt with in a specific order.)
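
    To make the "one running, one editable, fail over" idea concrete, here's a very rough sketch using a collectible AssemblyLoadContext from present-day .NET. It only illustrates the swap pattern, not the CLR feature being asked for, and IHandler, Plugin.Handler and the DLL path are made up for the example:

        using System;
        using System.IO;
        using System.Runtime.Loader;

        public interface IHandler { string Process(string input); }

        static class Failover
        {
            static IHandler current;                 // version the "running" side uses
            static AssemblyLoadContext currentAlc;

            // Load a freshly built handler DLL and swap it in without stopping
            // the host; the previous version's load context is then unloaded.
            public static void SwapIn(string dllPath)
            {
                var alc = new AssemblyLoadContext("edited", isCollectible: true);
                using (var fs = File.OpenRead(dllPath))
                {
                    var asm = alc.LoadFromStream(fs);
                    var next = (IHandler)Activator.CreateInstance(
                        asm.GetType("Plugin.Handler"));   // hypothetical type name

                    var old = currentAlc;
                    current = next;
                    currentAlc = alc;
                    old?.Unload();                        // best-effort cleanup
                }
            }

            public static string Handle(string input) => current.Process(input);
        }

    From the caller's point of view the switch is just a reference swap: the "running" side keeps calling Failover.Handle(...) while the "being edited" side rebuilds the handler DLL and calls SwapIn(...), so the running app never pauses.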