Cautious Concern But Missing Crucial Context – Justice Brown’s Decision in Haghshenas

After the Federal Court’s decision in Ocran v. Canada (MCI) 2022 FC 175, it was almost inevitable that we would be talking again about Chinook. Counsel (including ourselves) have been raising the use of Chinook and concerns about Artificial Intelligence in memoranda of argument and accompanying affidavits, arguing, for example, that much of the standard template language used falls short of the Vavilov standard and in many cases is non-responsive to, or not reflective of, the Applicant’s submissions.

We have largely been successful in getting cases consented to using this approach, yet I cannot say our overall success in resolving judicial reviews has followed suit. Indeed, we have recently been getting stuck at the visa office on re-opened applications more than we have in the past.

Today, the Federal Court rendered a decision that again engaged with Chinook and, in this case, also touched on Artificial Intelligence. Many took to Twitter and LinkedIn to express concern about bad precedent. Scholars such as Paul Daly also weighed in on Justice Brown’s decision, highlighting that there is simply a lot we do not know about how Chinook is deployed.

I might take a different view than many on this case. While I think it might be read (and could be pointed to as precedent by the Department of Justice) as a decision upholding the reasonableness and fairness of utilizing Chinook and AI, I also think there was no record tying the process to how it affected the outcome, which was clearly the link Justice Brown was concerned about.

Haghshenas v. Canada (MCI) 2023 FC 464

Mr. Haghshenas had his C-11 (LMIA-exempt) work permit refused on the basis that he would not leave Canada at the end of his authorized stay, pursuant to subsection 200(1) of the IRPR. It is interesting that in the Certified Tribunal Record, and specifically the GCMS notes, there is no mention of Chinook 3+ as is commonly disclosed now. However, there is the wording of Indicators (meaning risk indicators) as N/A and Processing Word Flag as N/A. These are Module 5 flags, which make up one of the columns in the Chinook spreadsheet, so presumably Chinook could have been used. However, we do note that the screenshots that were part of the CTR do not appear to include the Chinook tab or any screenshot of what Chinook looked at. From the record, this lack of transparency about which tool was actually used did not appear to be challenged.

Ultimately, the refusal decision itself is actually quite personalized, not carrying the usual pure template characteristics of the Module 4 Refusal Notes generator. There is a personalized assessment of the actual business plan, the profits considered (and labelled speculative by the Officer), and concerns about whether registration under the licensed contractor process had been done. From my own experience, this decision seems quite removed from the usual Module 3 output, and perhaps suggests either that Chinook was not fully engaged OR that the functionality of Chinook has gotten much better, to the point where its use becomes blurred. It could reasonably be both.

In upholding the procedural fairness and reasonableness of the decision, Justice Brown does engage in a discussion of Chinook and AI in two places.

In dismissing the Applicant’s argument on procedural fairness, Justice Brown writes:

[24] As to artificial intelligence, the Applicant submits the Decision is based on artificial intelligence generated by Microsoft in the form of “Chinook” software. However, the evidence is that the Decision was made by a Visa Officer and not by software. I agree the Decision had input assembled by artificial intelligence, but it seems to me the Court on judicial review is to look at the record and the Decision and determine its reasonableness in accordance with Vavilov. Whether a decision is reasonable or unreasonable will determine if it is upheld or set aside, whether or not artificial intelligence was used. To hold otherwise would elevate process over substance.

He writes later, under the heading on the reasonableness of the decision:

[28] Regarding the use of the “Chinook” software, the Applicant suggests that there are questions about its reliability and efficacy. In this way, the Applicant suggests that a decision rendered using Chinook cannot be termed reasonable until it is elaborated to all stakeholders how machine learning has replaced human input and how it affects application outcomes. I have already dealt with this argument under procedural fairness, and found the use of artificial intelligence is irrelevant given that (a) an Officer made the Decision in question, and that (b) judicial review deals with the procedural fairness and or reasonableness of the Decision as required by Vavilov.

Justice Brown appeared to be concerned that the Applicant did not tie the process of utilizing artificial intelligence or Chinook to how it actually impacted the reasonableness or fairness of the decision. Justice Brown is looking at the final decision and correctly suggests that an Officer made it and the Record justifies it. How it got from A to C is not the reviewable decision; what is reviewable is the A of the input provided to the Officer and the C of the Officer’s decision.

I want to ask about the missing B: the context.

It is also interesting to note, in looking at the Record, that the Respondent (Minister) did not engage in any discussion of Chinook or AI. The argument was raised solely by the Applicant, in two paragraphs of the written memorandum of argument and one paragraph of the reply. The Applicant’s argument, one rejected by Justice Brown, was that uncertainty about the reliability and efficacy of these tools, and the lack of communication about them, created uncertainty about how they were used, which ultimately impacted the fairness and reasonableness of the decision.

The Applicant captures these arguments in paragraphs 9, 10, and 32 of their memorandum, writing:

The nature of the decision and the process followed in making it

9. While the reason originally given to the Applicant was that the visa officer (the
decision maker) believed that the Applicant would not leave Canada based on the
purpose of visit, the reasons now given during these proceedings reveal that the
background rationale of the decision maker does not support refusal based on
purpose of visit. In fact, the application was delayed for nearly five months and in
the end the decision was arrived at with the help of Artificial Intelligence
technology of Chinook 3+. It is not certain as to what information was analysed
by the aforesaid software and what was presented to the decision maker to
make up a decision. It can be presumed that not enough of human input has
gone into it, which is not appropriate for a complicated case involving business
immigration. It is also not apt in view of the importance of the decision to the
individual, who has committed a great deal of funds for this purpose. (emphasis added)

10. Chinook is a processing tool that was developed to deal with the higher volume of
applications. This tool allows DMs to review applications more quickly.
Specifically, the DM is able to pull information from the GCMS system for many
applications at the same time, review the information and make decisions and
generate notes using a built-in note generator, in a fraction of the time it
previously took to review the same number of applications. It can be presumed
that not enough human input has gone into it, which is not appropriate for a
complicated case involving business immigration. In the case at hand, Chinook
Module 5- indicator management tool was used, which consists of risk indicators
and local word flags. A local word flag is used to assist in prioritizing applications.
It is left up to Chinook to search for these indicators and flags and create a
report, which is then copy and pasted into GCMS by the DM. The present case is
one that deserved priority processing being covered by GATS. Since the
appropriate inputs may not have been fed into the mechanised processes of
Chinook, which would flag priority in suchlike GATS cases, the DM’s GCMS
notes read “processing priority word flag: N/A”. This is clearly wrong and betrays
the fallout in using technology to supplant human input. The use of Chinook has
caused there to be a lack of effective oversight on the decisions being generated.
It is also not apt in view of the importance of the decision to the individual, who
has committed a great deal of funds for this purpose (Baker supra). (emphasis added)

32. On the issue of Chinook, while it can be believed that faced with a large volume of
cases, IRCC has been working to develop efficiency-enhancing tools to assist
visa officers in the decision-making process. Chinook is one such tool. IRCC has
been placing heavy reliance on it for more than a year now. However, as always
with use of any technology, there are questions about its reliability and efficacy for
the purpose it sets out to achieve. There are concerns about the manner in which
information is processed and analysed. The working of the system is still unclear
to the general public. A decision rendered using it cannot be termed reasonable until it is elaborated to all stakeholders to what extent has machine replaced human input and how it impacts the final outcome. The test set by the Supreme Court in Vavilov has not been met.

The Applicant appeared to be almost making an argument that the complexity of the Applicant’s case suggested Chinook should not have been used and that a human should therefore have reviewed it. However, there seemed to be a gap in engaging with both the fact that IRCC did not indicate it had used Chinook and the fact that the reasons were actually more responsive to the facts than usual. I also think the argument that a positive word flag should have been applied but was not ultimately did not get picked up by the Court, but it lacked a record of affidavit evidence or a challenge to the CTR […]