Is there any procedure for QA'ing an airline manufacturer's code?
DO-178C requires developing a rigorous and complex quality assurance process. This is handled largely through internal auditors and QA engineers, who are not just the software designers wearing different hats. It's also handled through designated certification representatives who perform audits, review certification submission documents, and approve unresolved bug/problem reports. Many companies also follow quality standards such as ISO 9000 or CMMI.
This QA process must include audits as explained in FAA Order 8110.49A, called "stage of involvement" audits. These happen throughout the development lifecycle, not just on the final product, and cover everything from whether your planning is adequate, to whether the requirements match the code, to whether your tests meet standards.
Does someone at the FAA get invited to Boeing's private GitHub account?
What's actually submitted to the FAA is limited, in part because the FAA's main goal during certification is to ensure the airplane was developed with a high-quality process. The FAA relies heavily on the internal auditors and certification liaisons, who have access to the code and requirements. The FAA does see a lot of high-level reports such as TSO submittals, a Software Accomplishment Summary, a System Safety Assessment, etc. It also sees a list of any open bug reports that actually affect the cockpit. Finally, the FAA is involved in a limited way in the flight testing of the plane.
The FAA rarely reviews the actual code. As selectstriker2 pointed out in his/her answer, a lot of aircraft code is considered trade secrets, and some considerations like export controls may apply. Even avionics suppliers and airframe developers are reluctant to share too much data with each other to prevent trade secret theft.
I can't locate a single patent application that even mentions what language it's written in
See What programming languages are used for equipment on board aircraft?. I understand your frustration, as I've found it hard myself to research the technology other companies are using.
Other points
DO-178 has some spots where it relies on the certifying company to be responsible, despite the fact that financial incentives exist to produce "just good enough" software. For example, the FAA assumes the company has a good design process (including human factors), adequate training, strong investment in QA with limited corner-cutting, and no deliberate misrepresentations.
Most importantly, the FAA aims to ensure the software development process is unlikely to produce uncaught errors, not that the software itself is completely bug-free. Many news reports about the recent 737 MAX crashes sensationalize this as "delegation" or even "self-auditing", but that's how the FAA has always handled avionics: it can't exhaustively test every feature itself, and it largely relies on the manufacturer to work through the complexities of what is and isn't safe. While the FAA has some recognized design standards like TSOs, these standards are very general. In some ways this makes sense: the FAA can't get caught in a cat-and-mouse game of updating its own design requirements for every new variation or feature in avionics.
Among the other certification requirements that apply to your question:
- A safety analysis, with calculations or models showing that hazards (say, an undetected nose-down command below 1000 ft AGL) are no more likely to occur than their severity allows. These models are often based on redundancy levels and malfunction probabilities (e.g., the probability of both AOA sensors failing must be below 10^-9 per flight hour) (see ARP4761).
- Both system and software requirements, with corresponding system and software tests. As an analogy, you're making sure the house is not only built according to the blueprints, but also that the blueprints make sense for a house. This relates to the point @user71659 brought up about validating the design in the context of the plane, not just verifying the code has no errors.
- Thorough testing, including testing safety features against sensor failures, testing the code to an MC/DC coverage level, and structural coverage analysis to ensure unintended interactions don't occur
- Configuration management, including change review and problem tracking. If you're following DO-178C, you can't merge in an unreviewed code change or hide a bug report
- Tool qualification to make sure the build environment, automatic tests, etc. are well-designed and stable. See DO-330 for more.
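To make the safety-analysis bullet above concrete, here's a minimal sketch of the kind of redundancy arithmetic those models involve. All numbers are invented for illustration; real ARP4761 analyses use measured or certified failure rates and must justify the assumption that the two failures are independent:

```python
# Hypothetical ARP4761-style redundancy calculation (invented numbers).

single_sensor_rate = 1e-5   # assumed failures per flight hour for one AOA sensor

# If the two sensors fail independently, the probability that both fail
# in the same flight hour is the product of their individual rates:
dual_failure_rate = single_sensor_rate ** 2   # 1e-10 per flight hour

# A catastrophic-severity hazard is typically budgeted at 1e-9 per flight hour:
budget = 1e-9
print(dual_failure_rate <= budget)  # True: the redundant pair meets the budget
```

The interesting part of a real analysis is usually not the multiplication but defending independence: two sensors that share a power bus, a wiring loom, or an icing environment can fail together far more often than the product of their individual rates suggests.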
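And to make the MC/DC bullet concrete: MC/DC (modified condition/decision coverage) requires showing that each boolean condition in a decision can independently flip the decision's outcome. A toy sketch with a made-up decision (not real avionics code, which is typically C or Ada):

```python
# Toy illustration of MC/DC. For the decision (a or b) and c, MC/DC
# requires test pairs where exactly one condition changes and the
# decision's outcome changes with it.

def decision(a: bool, b: bool, c: bool) -> bool:
    return (a or b) and c

# A minimal MC/DC test set: n + 1 = 4 cases for 3 conditions.
tests = [
    (True,  False, True),   # -> True
    (False, False, True),   # -> False; only 'a' differs from the case above
    (False, True,  True),   # -> True ; only 'b' differs from the case above
    (True,  False, False),  # -> False; only 'c' differs from the first case
]
for a, b, c in tests:
    print(a, b, c, decision(a, b, c))
```

This is much stronger than plain line or branch coverage: for a decision with n conditions it forces at least n + 1 test cases, which is why MC/DC is reserved for the highest design assurance level (Level A) in DO-178C.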
This is a wide topic, and a variety of certification documents apply to your question, including DO-178C, ARP4754A, and ARP4761. Whole books have been written on industry practices, so if you wish to understand this in detail I'd suggest picking one up (my go-to is Leanna Rierson's "Developing Safety-Critical Software").