Wednesday, February 10, 2010

Verizon FiOS Remote Codes for Xbox 360

I found this to be useful. I didn't feel like shelling out a bunch of dough for the Microsoft-brand remote when I already had the universal remote that came with my Verizon FiOS. They use a Philips RC1445302 remote (I've read the older model is the RC1445301, and this code might work there as well). You can find the model number on a sticker in the battery compartment. Anyway, the remote code doc says to use code 0549 on the DVD setting. While my remote happily accepts that code on that device, it does not appear to control my Xbox 360 at all. Maybe that's for an older Xbox 360 revision, as my unit is only a year old as of this writing.

So I went to the user manual at http://onlinehelp.verizon.net/consumer/bin/pdf/Philips RC1445302 User Guide.pdf and noticed a peculiar device under the Video - VCR section branded Microsoft. I didn't know they made VCRs. I also saw Media Center PC under the Video - DVD section. Both had code 1999; in fact, at a quick glance, at least one other vendor shared that code. So I programmed the remote:
  1. Hold down the DVD button for 2 seconds
  2. While still holding the DVD button, press OK. All the device lights should light up.
  3. Enter 1-9-9-9 on the number pad. The DVD light should blink after a moment. If all the device lights illuminate once for a longer time, the code was not accepted. In that case, try again in case you fat-fingered it, verify the remote control model on the label, or try that 0549 code (which didn't work for me).
  4. Test it out!
Notes:
  • This may work on the AUX device slot as well; I did not try it. However, I do know that some codes are not accepted on some device slots (that 0549 code was accepted on DVD but not AUX, and it didn't work anyway, but that's beside the point).
  • The universal remote control's Power button turns the Xbox on/off immediately. That is, there is no menu as when you hold the big Xbox Logo button on a wireless controller.
  • The arrow buttons map to the D-Pad (or perhaps the stick) and function intuitively.
  • The OK button seems to map to the Xbox A button, or at least the Xbox dashboard handles it in an intuitive fashion. I am not sure which button maps to the Xbox B button, but Menu seems to kick you out of a movie you are viewing and put you back at the Netflix menu.
  • The Info button toggles the time display overlay at the top of the screen on Netflix.
  • I have not tested the color buttons (the ones that Blu-ray devices seem to use) to see if they map to other Xbox buttons, but that would be icing since they are not really needed to watch movies.
  • If you find any other useful buttons feel free to post them in the comments. Also if you have another model remote control from Verizon and found another code or that this code works too, please share.
Enjoy!

Tuesday, January 19, 2010

Why .NET Heralds the End of Intelligent Software Developers

These are sentiments I have felt for quite some time, but have never written down, at least not at such length or breadth.

I believe that exclusive use of .NET is really causing the dumbing down of programmers. It's astounding how many people who recently finished their undergraduate degrees know nothing at all about what is going on inside a computer, how their poor code or sloppy algorithms thrash the TLB, the D-cache, or the hard disk, or how to implement anything at all outside of .NET. Don't even bother them with any kind of pointer math, algorithms, or data structures. I also feel that Microsoft's flagship language for the CLR, C#, is just a copy of Java that came 5 or 6 years later. Just compare the language rules, package framework, garbage collection, dynamic recompilation and bytecode (not exactly unique to Java either), and syntax of C# 1.0 and Java. It was basically Java with C++'s ":" operator for inheritance instead of Java's extends/implements keywords. Also, generics were added in C# 2.0, which are just a rip-off, both syntactically and conceptually, of C++ templates, except not as powerful (template methods and partial specialization were not supported in 2.0, as far as I know, for starters) and again 15+ years late to the game.

What do you get for all of the complexity? The most common pro I hear for C# vs. C++ is garbage collection... if that's what you're looking for, use RAII and stack-allocated wrappers to clean up for you (e.g. auto_ptr, boost::shared_ptr, etc.). Have you ever seen a correctly written C# (or, for that matter, Java) loop exhaust memory? Do you hate inserting artificial scopes (is that even possible in C#?) or slow function calls in a loop to work around these issues? Don't you wish you could manage the allocation yourself sometimes? Too bad. The second most common is that you don't end up with NULL pointer dereferences... well, you still end up with a similar number of "Object reference not set to an instance of an object" errors, which are exactly congruent and harder to guard against. In fact, it's probably even worse, because you can't check for equality with NULL first before dereferencing, except for the rarely used nullable types, which can then have the NULL pointer dereference issue again.

To draw an analogy, the creation of C# is kind of like the automatic transmission: most real auto racing is still done with some variant of a clutched manual transmission (except for some drag racing, where shifting has to be really fast). I also find that driving a manual transmission makes me more attentive, gives me more control over the vehicle, makes merging onto the freeway safer and easier (since I can choose when my transmission downshifts), and gives me a better chance and more responsiveness in a crisis. In fact, if I had access to accident statistics, we might find that drivers of manual-transmission cars have fewer at-fault accidents per capita (just speculating here, but I would not be surprised). It's also my personal experience that many more people used to know at least something about cars and how they work than do now. And if that manual shifting behavior were undesirable, you wouldn't see so many manumatic systems in AT cars. You can argue that the car is "more accessible" now, but I argue that it is less flexible and its users more lethargic or ignorant. That might boost sales, but I wager there would be few people still riding a horse if there were only MTs, or if ATs were a special-order option for those with a physical handicap preventing the use of MTs. Sorry to take so long to draw a parallel, but I think this is directly analogous to .NET. It's "a way" to do things, but it should not be "the way", is seldom if ever "the right way", and often is not even "a good way".

If you think that my disdain for .NET is an indication that I am some bitter old engineer from the '60s or '70s: I am actually in my 20s, and thankfully just missed the point when they stopped teaching "real" programming in school. And by the way, I used to work at Microsoft (full disclosure), and I left of my own accord, but not because of .NET :)

I originally posted this here, but I liked the response enough to post it on my personal blog as well.

Tuesday, October 13, 2009

Crash Windows Vista and Later Remotely With Non-authenticated User

Last month, I was looking into the reports of this flaw, and I wrote up a proof of concept in C/C++ on Windows. It seems pretty straightforward, and I was able to crash Windows 7 Beta and RC. Given how late in the release cycle the exploit was discovered, I would not be surprised if it works on the Windows 7 RTM bits, or the next Windows Server bits. I don't have much time to explain the defect in detail, but suffice it to say that it relies on poorly validated (or just plain unvalidated) SMB negotiation data in the srv2.sys driver. As most Windows driver authors know, a crash in kernel mode equals a BSOD.

For more information, see the following links:
For convenience (for most people), today I took the time to wrap the proof-of-concept source code in a VS 2008 project to make building the binary much easier. For people using some other build system, the only "non-default" lib you have to link is Ws2_32.lib, the Windows Sockets lib. For folks using GCC on Linux, sorry, I did not have time to make this cross-platform or portable. It shouldn't be too hard for you to strip out the WSA* Windows-specific sockets stuff and just use standard BSD-style sockets calls. There is also some address-resolution code that is probably MS-specific. In any event, the code is below. It's really just a single source file (though Visual Studio is good at adding a lot of crap to the project, I have tried to minimize it by not using PCH, etc.).

Here is a link to the project (includes binaries).

Notes
  • DO NOT use this tool against someone's computer! This may be a violation of law in your jurisdiction. Please use this for academic purposes only, for rebooting your own machines, for generating crash dumps for novice investigation, or for generating crash dumps when debugging other drivers or system apps when you don't have an alternative method.
  • The usage is pretty simple. From a command prompt just run "crash_remotely 69.69.69.69" where 69.69.69.69 is the remote machine address.
  • The project builds, as configured, to use the MSVCRT. If you are not using VC 2008, you will need the MSVCRT 9 runtime.
  • The target machine must be Windows Vista or later running the srv2.sys SMB/CIFS network-share driver, and it must have at least one active share (basically, the driver must be loaded and processing connection requests). You do not need share access, which underscores the severity and exploitability of this bug.
  • It is not clear when/if this bug will be addressed by Microsoft. If the tool does not work for you, tough luck.

Sunday, May 3, 2009

Fannie Mae/Freddie Mac

Discriminatory Lending
First of all, this post is way overdue. I should have posted something on this topic 6-8 months ago. In fact, the evidence in these videos was taken from sources spanning the past decade. I have made claims in the past that Fannie Mae and Freddie Mac issued loans to very unqualified borrowers. I am using the term "Discriminatory Lending," probably in the way most people do: I am talking about what is often termed "reverse racism," which is a stupid term because it is just plain old vanilla racism, but this is the legacy jargon we have to use. In 1977, the Community Reinvestment Act (CRA) was passed under Pres. Carter. In the late '90s (Clinton administration), there was a push to relax loan regulations even further so that previously non-qualified borrowers would be issued loans. Why the Democratic Party has been able to dupe the majority of people into thinking this was a Bush/Republican invention, and that the poor Dems wanted to push for regulation but were blocked, eludes me. It is a complete and utter lie. In fact, it was exactly the opposite. Wake up! People are so shortsighted in their historical perspective that they can't bear to remember the past decade!



Specific Connection to President Obama
This video shows President Obama's connection with ACORN and litigation against lenders whom he deemed racist because they denied loans to unqualified borrowers, a majority of whom were African Americans. I assert that if the numbers just aren't there, you should not be issued a loan. Think about it: if you believe that lenders were discriminating based on race, you are basically saying that a lender would turn down profit for the sake of perpetuating racism. Personally, I think loan officers just see you as a dollar sign and don't care about race. Let's face it... I'll set up a dichotomy: either they are "filthy capitalist pigs that will do anything for a dollar," or they are "red-neck racists who happen to be wearing a suit and tie" (ignoring the fact that many of the loan officers are probably not Caucasian themselves). You can't have it both ways, and I believe that the truth is neither of those choices. They are just people doing their jobs within the confines of the then-current legislation. Anyway, I have said my $0.02, watch the video :)

Thursday, February 5, 2009

Fear Mongering and Liberal Math

Take a look at this video clip of Nancy Pelosi fear-mongering. I don't want to make this a political blog site, but this is just the kind of idiotic fear-mongering that causes dangerous, expensive, rushed decisions. If you want to give Madame Speaker the benefit of the doubt, she made the same statement in a Chris Wallace interview on January 18th. Scare me once, shame on you; fool me twice, shame on me...



"Every month that we do not have an economic recovery package, 500 million Americans lose their jobs" - Speaker of the House, Nancy Pelosi

I always thought there were only somewhere on the order of 300-305M Americans at present (at least according to the CIA and the Census Bureau). So for her statement to be true, if we delay just one month, we would have to have a 100% employment rate (that is, everyone has at least one job), and about two-thirds of those people would have to hold two jobs (200M second jobs on top of 300M first jobs; again, I don't think this is true). Then, in that first month, every last one of those 500M jobs would have to be lost (I guess that includes hers too, hopefully). Actually, strictly speaking, even that wouldn't work, since she said 500M Americans would lose their jobs, not that 500M American jobs would be lost. Semantics aside, what would happen the next month? There would be no more American jobs to lose... or is she assuming that we each have 4, or 6, or 8 jobs, so we can stretch this out to 2, or 3, or 4 months?!?

Liberal math does not compute! Maybe that's why so many of the Democrats in power have had serious tax-evasion problems surface recently. These are not little errors here and there, but often multi-year, tens-to-hundreds-of-thousands-of-dollars "accidents" or "mistakes" made by the likes of Daschle, Geithner, Murphy, and Rangel. Tax evasion is another topic for another time, though. I want to discuss something else while we are still on the sub-topic of unemployment.

Unemployment Timeline

Check out this unemployment timeline. Here are some interesting observations:

  • The timeline does not have unemployment data points for the Great Depression years, but estimates were 23-25% for several years. So everyone comparing this (finally official) recession to the Great Depression is slapping the face of those who survived the 1930s. In fact, the '80s recession and many years before it had unemployment in excess of 10.0% (the current 2009 rate is estimated at ~7.5%), so what we are in is not even as bad as the '80s and other years with regard to employment.
  • For a lot of the 1990s and 2000s, the unemployment rate was below 6.0%. These years included three Bush terms (G.H.W. and G.W.) and two Clinton terms, but I will let you decide who should get more acclaim for this.
  • After the 2nd Gulf War began, you can see a prolonged decline in unemployment down to pre 9/11 rates. I am not promoting nor am I discounting the war, I am just saying that the statistics coincide and are probably correlated.
  • It's interesting to see that in the recent congress, the 110th Congress, the Democrats were the majority party. Since the 110th first met in 2007, there has been a gradual rise in unemployment leading up to current events. The majority lead has even increased in the succeeding congress, the 111th Congress.
  • Just following a minimum wage increase, you can usually see an increase in unemployment, sometimes prolonged. Not being an economist, I can only speculate that if the two are correlated, it is due to the sudden increase in operating costs, which forces spending cuts (in the form of a decreased labor quantity).

Sunday, February 1, 2009

Numerical Methods for Solving Initial Value Problems

Overview

I warned you that this blog would be random, and now we will discuss an undergraduate math topic. Recently, I was cataloging some of my old textbooks and I came across one on differential equations: Differential Equations: Computing and Modeling, by C. Henry Edwards and David E. Penney. It occurred to me that while I had done some small computation projects in this class using formulas in Excel and some MATLAB programs, most people do not have access to MATLAB (at least legally), though they should have access to a C or C++ compiler. So, without further ado, here is a small project (single source file) with multiple numerical methods for solving differential equations.

Introduction

First we need to define some terminology.
  • ODE - Ordinary differential equation. This is an equation which contains functions of only one independent variable, and derivatives with respect to that variable.
  • IVP - Initial value problem. This is an ODE together with an initial condition. For example, dy/dx = y, y(0) = 1 is an IVP. (Incidentally, if you solve this for y(1), you will determine Euler's constant e.)
  • Derivative - What is wrong with you? Go back to Calculus 1. Sorry, that was an inside joke.
Next, in order to solve your problem using these numeric methods (as implemented), you need to express your differential equation such that y'=dy/dx is on one side. For example:

y' - y = x

should be expressed as

dy/dx = y + x

Once this is done, you will implement dy/dx as a function (a C++ function here) which takes two doubles and returns a double. Specifically, the signature is as below:

// This is your differential equation y'(x,y)... that is
// formulate the function dy/dx as a function of x,y
typedef double (*Y_primed)(double x, double y);

So, to implement the example function that was given, you could do something like the following:

// This is the example y', following the signature above
double sample_y_primed(double x, double y) {
    return x + y;
}

The Numerical Methods

Numerical methods are used for solving differential equations in an approximate way. There are three numerical methods used in this source. Space will not allow me to cover them thoroughly here, but here are some Wikipedia resources. You can find the algorithms there as well.

  • Euler's Method - created by Leonhard Euler, mathematical genius. http://en.wikipedia.org/wiki/Euler_method
  • Improved Euler's Method - also called Heun's Method, after Karl Heun. This method is an extension of the Euler Method. http://en.wikipedia.org/wiki/Heun%27s_method
  • Runge-Kutta Method - a numerical method created by Carl Runge and Martin Wilhelm Kutta which has much lower error for a given number of steps, and thus better accuracy than Euler's Method for the same number of iterations. Alternatively, a lot of computational efficiency can be gained at the expense of a little accuracy when compared to Euler's Method. http://en.wikipedia.org/wiki/Runge-Kutta_methods

The implemented numerical methods all follow this prototype:
double SomeNumericalMethod(Y_primed f, double initial_x, double initial_y, double final_x, unsigned int steps)

... where f is the y' function covered earlier; initial_x is the x-value at which the initial condition holds; initial_y is the y-value at the initial condition; final_x is the x-value at which to calculate the new y-value; and steps is the number of iterations. The approximate new y-value, that is, the approximate y(final_x), is returned.

Here is a small example:

dy/dx = x + y, y(0) = 1

We shall call the ImprovedEulersMethod() function:

double y_of_1 = ImprovedEulersMethod(sample_y_primed, 0, 1, 1, 10);

You should find that y_of_1 is near 3.4282, which is approximately equal to the real y-value y(1) = 3.4366 (the exact solution is y = 2e^x - x - 1, so y(1) = 2e - 2). Increase the number of steps (20, 40, 80, ...) to see how the accuracy increases.

Downloading

Here's a link to the C++ source code.

Unexplored Ideas
  1. What is the error of each method as a function of the number of iterations used?
  2. How many iterations does it take to achieve n decimal places of accuracy?
  3. Derive and prove why the special IVPs and final x-values for e, ln(2), and pi arrive at those values.