Chinook

Award-Winning Canadian Immigration and Refugee Law and Commentary Blog


Could the Federal Court Have Avoided the Chinook Abuse of Court Process Tetralogy?


What Happened?

In the recent decision of [1] Ardestani v. Canada (Citizenship and Immigration), 2023 FC 874, part of a tetralogy of cases in which Federal Court justices were critical of attacks on IRCC’s Chinook system, Justice Aylen did not mince words.

She writes:

II. Preliminary Issue

[8] At the commencement of the hearing, counsel for the Applicant advised that he was relying on his written representations but requested that counsel for the Respondent answer five questions related to this matter. As I advised counsel for the Applicant, a hearing of an application for judicial review is not an examination for discovery. Counsel for the Respondent was under no obligation to answer his questions. Moreover, it was not open to the Applicant to raise new issues at the hearing of the application.

[9] I also raised with counsel for the Applicant the fact that two decision of this Court have recently been issued – Raja v Canada (Minister of Citizenship and Immigration), 2023 FC 719 and Haghshenas v Canada (Minister of Citizenship and Immigration), 2023 FC 464 – in which counsel for the Applicant made a number of the same arguments as raised in this application and which were all dismissed by this Court, twice. I asked counsel for the Applicant if he was continuing to pursue these issues notwithstanding the earlier findings of this Court and he indicated that he was.

[10] I find that counsel for the Applicant’s attempt to re-litigate such issues and to transform the hearing of this application into an examination for discovery constitutes an abuse of this Court’s processes. (emphasis added)

In the decision, Justice Aylen comments on the arguments made by the Applicant against Chinook:

[26] The Applicant asserts that his work permit application was processed using Chinook, which in and of itself is a breach of procedural fairness. Moreover, he asserts that the use of Chinook was improper given the importance of the decision at issue and the degree of complexity of the decision at issue (which involved business immigration). There is also no merit to these assertions. I am not satisfied that the use of Chinook, on its own, constitutes a breach of procedural fairness or that the nature of the application itself has any bearing on the use of Chinook. The evidence before the Court is that the decision was made by an Officer, with the assistance of Chinook. Whether or not there has been a breach of procedural fairness will turn on the particular facts of the case, with reference to the procedure that was followed and the reasons for decision [see Haghshenas, supra].

….

[34] The Applicant further asserts that the use of Chinook is “concerning”, suggesting essentially that any decision rendered in which Chinook was used cannot be reasonable. I see no merit to this suggestion. The burden rests on the Applicant to demonstrate that the decision itself lacks transparency, intelligibility and/or justification, and baseless musings about how Chinook was developed and operates does not, on its own, meet that threshold. (emphasis added)

While Justice Aylen discusses two other cases, Raja and Haghshenas, there was also a third, Zargar, released just before Ardestani. All of these cases involve an essentially identical fact pattern of Iranian C11 applicants being refused work permits.

In the interest of summarizing the discussion of Chinook in each, and notwithstanding that I have only seen the full file record in Haghshenas (about which I have also written a past post, see: here), I will extract block quotes of what the judges said about Chinook in each of the remaining three decisions.

 

[2] Zargar v. Canada (Citizenship and Immigration), 2023 FC 905 (CanLII), <https://canlii.ca/t/jxxpc> – Justice McDonald, Dismissed

[12] Firstly, the allegations regarding: (a) the use of Chinook, (b) reasons only being provided after the judicial review Application was filed, and (c) the length of the processing time were fully canvassed in both Haghshenas v Canada (Citizenship and Immigration), 2023 FC 464 [Haghshenas] and Raja v Canada (Citizenship and Immigration), 2023 FC 719 [Raja]. In the absence of any specific evidence to support these allegations in this case, I adopt the analysis from those cases (Haghshenas paras 22-25, 28; Raja at paras 28-38) and can likewise conclude that the Applicant has not established any breach of procedural fairness on these grounds. (emphasis added)

Note that there was an apparent lack of evidence filed in Zargar.

 

[3] Raja v. Canada (Citizenship and Immigration), 2023 FC 719 (CanLII), <https://canlii.ca/t/jxfdq> – Justice Ahmed, Dismissed

[24] The Applicant submits that the Officer assessed his work permit application on the basis of irrelevant and extraneous criteria, but does not specify which criteria. The Applicant also submits that the IRCC’s reliance on Chinook, an efficiency-enhancing tool used to organize information related to applicants for temporary residence, undermines the reasonableness of the Officer’s decision.

….

B. Procedural Fairness

(1) Use of Chinook Processing Tool

[28] The Applicant submits that the Officer’s use of the Chinook processing tool to assist in the assessment of the application is procedurally unfair. The Applicant contends that the tool, which he claims is able to extract information from the GCMS for many applications at a time and generate notes about these applications in “a fraction of the time” it would take to review an application otherwise, results in a lack of adequate assessment of the Applicant’s work permit application.

[29] The Respondent submits that IRCC’s use of the Chinook tool to improve efficiency in addressing a voluminous number of temporary residence applications does not amount to a specific failure of procedural fairness in the Applicant’s case. The Respondent notes that the Applicant has failed to point to any evidence to support that the Officer’s use of the Chinook tool resulted in the omission of a key consideration in the assessment of his application or deprived him of the right to have his case heard. The Respondent contends that the Applicant’s submissions appear to be little more than an objection to IRCC’s use of this tool.

 

[30] I agree with the Respondent. While it was open to the Applicant to raise the ways that the Chinook processing tool specifically resulted in a breach of procedural fairness in the Officer’s assessment of his case, he has not provided any evidence of such a connection. I would also note that the Chinook tool is not intended to process, assess evidence, or make decisions on applications, and the Applicant has failed to raise any evidence countering this or demonstrating that the tool impacts the fairness of the decision-making process. (emphasis added)

Note – again, there appears to have been a lack of evidence filed. However, I do take issue with the “Chinook tool is not intended to process, assess evidence” portion. I think there is not enough on the record, or in what IRCC has publicly shared, to make that statement. Chinook is a processing tool at the end of the day, so it does process; and based on what we know about how the modules work (especially Module 5’s risk indicators and local word flags), it definitely assesses and ‘flags’ the evidence at the very least, as the sketch below illustrates.
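To illustrate why I say even a flag is a form of assessment, here is a minimal, purely hypothetical sketch of what a Module 5-style “word flag” pass could look like. The flag terms, field names, and matching logic below are my own assumptions for illustration only – IRCC has not published Chinook’s actual code – but even this much filtering is already characterizing the evidence before an Officer reads it.

```python
# Hypothetical sketch of a Module 5-style "risk indicator"/"word flag" pass.
# Flag lists, field names, and matching rules are illustrative assumptions;
# IRCC has not published Chinook's actual logic.

RISK_INDICATORS = {"unverified employer", "large cash deposit"}  # assumed examples
PRIORITY_FLAGS = {"GATS", "medical professional"}                # assumed examples

def flag_application(app: dict) -> dict:
    """Scan an application's fields and return any matching flags.

    Even this trivial keyword match is an assessment step: it decides
    which files are surfaced to the Officer as 'risky' or 'priority'
    before any human has read the evidence.
    """
    text = " ".join(str(value).lower() for value in app.values())
    return {
        "risk_indicators": sorted(t for t in RISK_INDICATORS if t.lower() in text),
        "word_flags": sorted(t for t in PRIORITY_FLAGS if t.lower() in text),
    }

# A GATS-covered applicant would be flagged for priority -- and if the flag
# list is misconfigured or empty, the GCMS notes would simply read "N/A".
print(flag_application({"purpose": "Work covered by GATS commitments"}))
# -> {'risk_indicators': [], 'word_flags': ['GATS']}
```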

[4] Haghshenas v. Canada (Citizenship and Immigration), 2023 FC 464 (CanLII), <https://canlii.ca/t/jwhkd> – Justice Brown, Dismissed

[24] As to artificial intelligence, the Applicant submits the Decision is based on artificial intelligence generated by Microsoft in the form of “Chinook” software. However, the evidence is that the Decision was made by a Visa Officer and not by software. I agree the Decision had input assembled by artificial intelligence, but it seems to me the Court on judicial review is to look at the record and the Decision and determine its reasonableness in accordance with Vavilov. Whether a decision is reasonable or unreasonable will determine if it is upheld or set aside, whether or not artificial intelligence was used. To hold otherwise would elevate process over substance. (emphasis added)

….

[28] Regarding the use of the “Chinook” software, the Applicant suggests that there are questions about its reliability and efficacy. In this way, the Applicant suggests that a decision rendered using Chinook cannot be termed reasonable until it is elaborated to all stakeholders how machine learning has replaced human input and how it affects application outcomes. I have already dealt with this argument under procedural fairness, and found the use of artificial intelligence is irrelevant given that (a) an Officer made the Decision in question, and that (b) judicial review deals with the procedural fairness and or reasonableness of the Decision as required by Vavilov. (emphasis added)

What arises from the above is that precious court resources were spent on four identical cases from the same counsel, making identical arguments and producing nearly identical judgments, all summing up to ‘we do not have enough in front of us.’

One wonders whether the Department of Justice should have heeded Justice Little’s comments in Ocran v. Canada (Citizenship and Immigration), 2022 FC 175 (CanLII), <https://canlii.ca/t/jmk0l> and asked for a reference, but as these tools are constantly evolving, I do understand the trepidation and costs:

V. Matters Raised by the Respondent

[57] The respondent raised additional matters for resolution by this Court about the preparation of GCMS notes generally by visa officers using spreadsheets made with a software-based tool known as the “Chinook Tool”. The respondent sought to resolve an issue about whether contents of Certified Tribunal Record (“CTRs”) were deficient because the spreadsheets are not retained and therefore do not appear in the CTRs prepared for matters such as this application. The respondent also purported to file an affidavit in an effort to provide a factual foundation; the applicant objected to its admissibility and relevance to the proceeding.

[58] In my view, the Court should not resolve the additional matters raised by the respondent on this application. There is no dispute or controversy between these parties […]


Cautious Concern But Missing Crucial Context – Justice Brown’s Decision in Haghshenas

After the Federal Court’s decision in Ocran v. MCI (Canada), 2022 FC 175, it was almost inevitable that we would be talking again about Chinook. Counsel (including ourselves) have been raising the use of Chinook and concerns about Artificial Intelligence in memoranda of argument and accompanying affidavits, arguing – for example – that much of the standard template language used falls short of the Vavilov standard and in many cases is non-responsive to the Applicant’s submissions.

We have largely been successful in getting cases consented to using this approach, yet I cannot say our overall success in resolving judicial reviews has followed suit. Indeed, we have recently been stuck at the visa office on re-openings more than we have been in the past.

Today, the Federal Court rendered a decision that again engaged with Chinook and, in this case, also touched on Artificial Intelligence. Many took to Twitter and LinkedIn to express concern about bad precedent. Scholars such as Paul Daly also weighed in on Justice Brown’s decision, highlighting that there is simply a lot we do not know about how Chinook is deployed.

I might take a different view than many on this case. While I think it might be read (and could be pointed to as precedent by the Department of Justice) as a decision upholding the reasonableness and fairness of utilizing Chinook and AI, I also think there was no record tying the process to how it affects the outcome – clearly the link that Justice Brown was concerned about.

Haghshenas v. Canada (MCI) 2023 FC 464

Mr. Haghshenas had his C-11 (LMIA-exempt) work permit refused on the basis that he would not leave Canada at the end of his authorized stay, pursuant to subsection 200(1) of the IRPR. It is interesting that in the Certified Tribunal Record, and specifically the GCMS notes, there is no mention of Chinook 3+ as is commonly disclosed now. However, there is the wording of Indicators (meaning risk indicators) as N/A and Processing Word Flag as N/A. These are Module 5 flags that make up one of the columns in the Chinook spreadsheet, so it is presumable that Chinook could have been used. However, we note that the screenshots that were part of the CTR do not appear to include the Chinook tab or any screenshot of what Chinook looked at. From the record, this lack of transparency about which tool was actually used does not appear to have been challenged.

Ultimately, the refusal decision itself is actually quite personalized – not carrying the usual pure template characteristics of the Module 4 refusal notes generator. There is a personalized assessment of the actual business plan, the profits considered (and labelled speculative by the Officer), and concerns about whether registration under the licensed contractor process had been done. From my own experience, this decision seems quite removed from the usual Module 3 output, and perhaps suggests either that Chinook was not fully engaged OR that the functionality of Chinook has improved to the point where its use becomes blurred. It could reasonably be both.

In upholding the procedural fairness and reasonableness of the decision, Justice Brown does engage in a discussion of Chinook and AI in two areas.

In dismissing the Applicant’s argument on procedural fairness, Justice Brown writes:

[24] As to artificial intelligence, the Applicant submits the Decision is based on artificial intelligence generated by Microsoft in the form of “Chinook” software. However, the evidence is that the Decision was made by a Visa Officer and not by software. I agree the Decision had input assembled by artificial intelligence, but it seems to me the Court on judicial review is to look at the record and the Decision and determine its reasonableness in accordance with Vavilov. Whether a decision is reasonable or unreasonable will determine if it is upheld or set aside, whether or not artificial intelligence was used. To hold otherwise would elevate process over substance.

He writes later, under the reasonableness of the decision heading:

[28] Regarding the use of the “Chinook” software, the Applicant suggests that there are questions about its reliability and efficacy. In this way, the Applicant suggests that a decision rendered using Chinook cannot be termed reasonable until it is elaborated to all stakeholders how machine learning has replaced human input and how it affects application outcomes. I have already dealt with this argument under procedural fairness, and found the use of artificial intelligence is irrelevant given that (a) an Officer made the Decision in question, and that (b) judicial review deals with the procedural fairness and or reasonableness of the Decision as required by Vavilov.

Justice Brown appeared to be concerned with the Applicant’s failure to tie the process of utilizing artificial intelligence or Chinook to how it actually impacted the reasonableness or fairness of the decision. Justice Brown looks at the final decision and correctly suggests that an Officer made it and the Record justifies it; how it got from A to C is not the reviewable decision – what is reviewable is the A of the input provided to the Officer and the C of the Officer’s decision.

I want to ask about the missing B – the context.

It is also interesting to note, in looking at the Record, that the Respondent (Minister) did not engage in any discussion of Chinook or AI. The argument was raised solely by the Applicant – in two paragraphs of the written memorandum of argument and one paragraph of the reply. The Applicant’s argument, one rejected by Justice Brown, was that uncertainty about the reliability and efficacy of these tools, and the lack of communication about how they were used, ultimately impacted the fairness/reasonableness of the decision.

The Applicant captures these arguments in paragraphs 9, 10, and 32 of their memorandum, writing:

The nature of the decision and the process followed in making it

9. While the reason originally given to the Applicant was that the visa officer (the decision maker) believed that the Applicant would not leave Canada based on the purpose of visit, the reasons now given during these proceedings reveal that the background rationale of the decision maker does not support refusal based on purpose of visit. In fact, the application was delayed for nearly five months and in the end the decision was arrived at with the help of Artificial Intelligence technology of Chinook 3+. It is not certain as to what information was analysed by the aforesaid software and what was presented to the decision maker to make up a decision. It can be presumed that not enough of human input has gone into it, which is not appropriate for a complicated case involving business immigration. It is also not apt in view of the importance of the decision to the individual, who has committed a great deal of funds for this purpose. (emphasis added)

10. Chinook is a processing tool that it developed to deal with the higher volume of applications. This tool allows DMs to review applications more quickly. Specifically, the DM is able to pull information from the GCMS system for many applications at the same time, review the information and make decisions and generate notes using a built-in note generator, in a fraction of the time it previously took to review the same number of applications. It can be presumed that not enough human input has gone into it, which is not appropriate for a complicated case involving business immigration. In the case at hand, Chinook Module 5 – indicator management tool was used, which consists of risk indicators and local word flags. A local word flag is used to assist in prioritizing applications. It is left up to Chinook to search for these indicators and flags and create a report, which is then copy and pasted into GCMS by the DM. The present case is one that deserved priority processing being covered by GATS. Since the appropriate inputs may not have been fed into the mechanised processes of Chinook, which would flag priority in suchlike GATS cases, the DM’s GCMS notes read “processing priority word flag: N/A”. This is clearly wrong and betrays the fallout in using technology to supplant human input. The use of Chinook has caused there to be a lack of effective oversight on the decisions being generated. It is also not apt in view of the importance of the decision to the individual, who has committed a great deal of funds for this purpose (Baker supra). (emphasis added)

32. On the issue of Chinook, while it can be believed that faced with a large volume of cases, IRCC has been working to develop efficiency-enhancing tools to assist visa officers in the decision-making process. Chinook is one such tool. IRCC has been placing heavy reliance on it for more than a year now. However, as always with use of any technology, there are questions about its reliability and efficacy for the purpose it sets out to achieve. There are concerns about the manner in which information is processed and analysed. The working of the system is still unclear to the general public. A decision rendered using it cannot be termed reasonable until it is elaborated to all stakeholders to what extent has machine replaced human input and how it impacts the final outcome. The test set by the Supreme Court in Vavilov has not been met.

The Applicant appeared to be almost arguing that the complexity of the case meant Chinook should not have been used and that a human should have reviewed it instead. However, there seemed to be a gap in engaging with both the fact that IRCC did not indicate it had used Chinook and the fact that the reasons were actually more responsive to the facts than usual. I also think the argument that a positive word flag should have been applied but was not ultimately did not get picked up by the Court – it lacked a record of affidavit evidence or a challenge to the CTR […]


Three Belated Crystal Ball Predictions for Canadian Immigration in 2022

While March may seem to some a little late to be predicting a year’s events (given Q1 is nearing its end), I will take the contrarian position that it is not. Right now is perhaps the perfect time to try to make a prediction. All the big-picture pieces are out of the way. We know what the levels plan looks like, especially in terms of the reduction of CECs landed in 2022.

Prediction 1: 2022 will be about AI vs. IA

I believe IRCC is trying full throttle to implement AI (Advanced Analytics) across all their Lines of Business (LOBs), from temporary to permanent residence to citizenship. The speed with which artificial intelligence can be implemented, with public support, to process high volumes of applications to Canada will be pitted against the impact of international affairs/crises/refugee-producing situations. If Ukraine is the new precedent set by IRCC for tackling refugee/humanitarian wars and crises – which politically it appears it will have to be, so that the Government can appear anti-racist (see Prediction 3) – this will inevitably delay and shift resources. If AI can be quickly implemented to deal with the quick decisions (both approvals and refusals), this might be the best solution for the Government. Meanwhile, those who are more critical of AI systems (myself included) might ask for more caution in the process.

Email headings between senior A2SC (Advanced Analytics) folks, received via ATIP

 

Prediction 2: TEERing Up the Economic Immigration System Will Leave Some Behind 

The new TEER system replaces the NOC skill levels in a year where economic permanent residence applications, largely filled by NOC B positions, are backlogged and paused. How will IRCC adapt and change the rules of the game with the implementation of TEER? What does this mean for the future of FSW/CEC?

If the math is as it is above, we could see a shrinking of NOC 0AB, so that the 70% of unit groups once eligible (NOC 0, A, B) turns into 59% (TEER 0, 1, 2). While it seems like much of what will occur will be ‘mergers’, I am eager to see what happens to tweener jobs such as administrative assistant and retail sales supervisor. I suspect the first place we will see a major impact will be the Federal Skilled Worker program, where we may move to exclusion lists or targeted draws for specific TEER categories.

 

Prediction 3: IRCC Will Be Forced/Asked to Clean Up the House on Anti-Racism

IRCC’s Anti-Racism Polaris Report and recent concerns (including the next Parliamentary Study) about discrepant processing rates will lead the Department to try and address this in policy options and offerings.

The emails between IRCC staff looking into preventing bias and racism in systems show good work in the right direction, but there will be growing calls for an independent oversight commission or ombudsperson.

Immigration is so deeply entrenched with racist roots from our history of exclusion – now manifested in explicit and implicit biases, two-tiered systems, secret programs, and differing criteria – that I really do not see how we can build an anti-racist system without first tearing down the existing one. Economically (in terms of investment into things such as technology) and politically (given we are still considered globally to have a decent/attractive system), I don’t see us doing that.

What you will likely see is a greater platforming and emphasis of the Gender-Based Analysis Plus (GBA+) work, as well as projects taken up that give at least a cover or presentation of progress. Yet I and other critics are still hopeful that the Government does not shy away from a hard, introspective look at the systems that have already been developed and paid for, to see where key fixes are needed.

I do see that those on the other side – advocates, lawyers, etc. – are shifting away from their own Whiteness, and once those litigation skills and experiences are transferred to the new generation of racialized lawyers who have a keen sense of justice and have lived and felt the discrepancy, they will start attacking the foundations. I think right now is the perfect time for IRCC to do some public relations/communication work around anti-racism, to pad the intention piece, and to build in justifications/explanations/evidence for when these matters eventually get litigated.

As I have presented and said – immigration is itself state-sponsored discrimination. I don’t think we will ever eliminate it to the point where Applicants are happy and Immigration loses its defining role as a filtering mechanism based on race + citizenship. Yet I definitely see a bigger role for those who advocate for safeguards, and 2022 as the year some of those safeguards start being introduced.


Chinook is AI – IRCC’s Own Policy Playbook Tells Us Why

One of the big debates around Chinook is whether or not it is Artificial Intelligence (“AI”). IRCC’s position has been that Chinook is not AI because there is a human ultimately making decisions.

In this piece, I will show how the engagement of a human in the loop is a red herring, and how the debate skews the real issue: that automation, whether for business functions only or to help administer an administrative decision, can have adverse impacts if unchecked by independent review.

The main source of my argument that Chinook is AI is IRCC itself – the Policy Playbook on Automated Support on Decision-Making 2021. This is an internal document, updated yearly, which likely captures the most accurate ‘behind the scenes’ snapshot of where IRCC is heading. More on that in future pieces.

AI’s Definition per IRCC

The first, and most important, step is to start with the definition of Artificial Intelligence within the Playbook.

The first thing you will notice is that Artificial Intelligence is defined very broadly by IRCC, which seems to cut against the narrow definition it paints when it comes to defining Chinook.

Per IRCC, AI is:

If you think of Chinook as dealing with the cognitive problem of attempting to issue bulk refusals – and utilizing computer science (technology) to apply learning, problem solving, and pattern recognition – it is hard to imagine that the system would even be needed if it weren’t AI.

Emails among IRCC staff actively discussing the use of Chinook to monitor approval and refusal rates utilizing “Module 6”

Looking at the Chinook modules themselves, Quality Assurance (“QA”) is built in as a module. It is hard to imagine a QA system that looks at refusal and approval rates and automates processes, yet is not AI.

As this article points out:

Software QA is typically seen as an expensive necessity for any development team; testing is costly in terms of time, manpower, and money, while still being an imperfect process subject to human error. By introducing artificial intelligence and machine learning into the testing process, we not only expand the scope of what is testable, but also automate much of the testing process itself.

Given the volume of files that IRCC is dealing with, it is unlikely that the QA process relies only on humans and not technology (else why would Chinook be implemented?). And if it involves technology and automation (a word that shows up multiple times in the Chinook Manual) to aid the monitoring of a subjective administrative decision – guess what – it is AI.

We also know that Chinook is underpinned with ways to process data, look at historical approval and refusal rates, and flag risks. It also integrates with Watchtower to review the risk of applicants.
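As a thought experiment, here is a bare-bones sketch of what “Module 6”-style rate monitoring could amount to. The data shape, office names, and outlier threshold are invented for illustration – nothing here reflects IRCC’s actual implementation – but once software is computing and comparing approval/refusal rates across offices, it is doing precisely the kind of pattern recognition the Playbook’s broad definition captures.

```python
# Hypothetical sketch of "Module 6"-style approval/refusal rate monitoring.
# Data shape, office names, and the outlier threshold are invented;
# IRCC has not published how Chinook's QA module actually works.

from collections import defaultdict

def refusal_rates(decisions: list[tuple[str, str]]) -> dict[str, float]:
    """Compute per-office refusal rates from (office, outcome) records."""
    totals, refused = defaultdict(int), defaultdict(int)
    for office, outcome in decisions:
        totals[office] += 1
        refused[office] += outcome == "refused"
    return {office: refused[office] / totals[office] for office in totals}

def flag_outliers(rates: dict[str, float], threshold: float = 0.15) -> list[str]:
    """Flag offices whose refusal rate deviates from the mean by > threshold."""
    mean = sum(rates.values()) / len(rates)
    return [office for office, rate in rates.items() if abs(rate - mean) > threshold]

decisions = ([("Ankara", "refused")] * 80 + [("Ankara", "approved")] * 20
             + [("London", "refused")] * 30 + [("London", "approved")] * 70)
rates = refusal_rates(decisions)
print(rates)                 # {'Ankara': 0.8, 'London': 0.3}
print(flag_outliers(rates))  # ['Ankara', 'London'] -- both deviate from the 0.55 mean
```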

It is important to note that even in the Daponte Affidavit in Ocran – which, alongside ATIPs, is the only information we have about Chinook – the focus has always been on the first five modules. Without knowledge of the true nature of something like Module 7, titled ‘ToolBox’, it is certainly premature to label the whole system as not AI.

 

Difficult to Argue Chinook is Purely Process Automation Given Degree of Judgment Exercised by System in Setting Up Findecs (Final Decisions)

Where IRCC might be trying to carve a distinction is between process automation/digital transformation and automated decision support systems.

One could argue, for example, that most of Chinook is process automation.

For example, the very underpinning of Chinook is that it allows the entire application to be made available to the Officer in one centralized location, without opening the many windows that GCMS required. Data points and fields auto-populate from an application and GCMS into the Chinook software, allowing the Officer to render decisions more easily. We get this. It is not debatable.
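To make the distinction concrete, here is a minimal sketch of what that uncontroversial “process automation” layer amounts to: pulling fields for many applications out of a GCMS-like store and laying them out one row per applicant. The field names and data shape are my own placeholders – the real GCMS schema is not public.

```python
# Hypothetical sketch of the "process automation" layer: consolidating
# GCMS fields into one row per applicant, spreadsheet-style.
# Field names are placeholders; the real GCMS schema is not public.

FIELDS = ["applicant_name", "country", "permit_type", "purpose_of_visit"]

def build_worksheet(gcms_records: list[dict]) -> list[list[str]]:
    """Flatten many applications into rows so an Officer sees them all
    in one view, instead of opening several GCMS windows per file."""
    rows = [[str(record.get(field, "")) for field in FIELDS] for record in gcms_records]
    return [FIELDS] + rows

for row in build_worksheet([
    {"applicant_name": "A. Example", "country": "Iran", "permit_type": "C11"},
    {"applicant_name": "B. Example", "country": "Ghana", "permit_type": "TRV"},
]):
    print(row)
```

Nothing in that snippet exercises judgment; the contested question is what happens once the same pipeline starts flagging, sorting, or drafting reasons.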

But does it cross into an automated decision support system? Is some degree of judgment that would traditionally be exercised by humans passed on to technology when Chinook is applied?

As IRCC defines:

Chinook directly assists an Officer in approving or refusing a case. Indeed, Officers have to apply discretion in refusing, but Chinook presents and automates the process. Furthermore, it has fundamentally reversed decision-making, turning it into a decide-first, justify-later approach with the refusal notes generator. Chinook without AI generating the framework, setting up the bulk categories, and automating an Officer’s logical reasoning process simply does not exist.

These systems replace the process of Officers needing to manually review documents, render a final decision, and take notes to file justifying their decision. It is to be noted that this is still the process at low-volume/Global North visa offices, where decisions are made this way and are reflected in extensive GCMS notes.

In Chinook, any notes taken are hidden and deleted by the system, and a template of bulk refusal reasons auto-populates, replacing and shielding the actual factual context of the matter from scrutiny.
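Here is a hedged sketch of what a decide-first, justify-later notes generator could look like. The canned reason texts and ground names below are invented for illustration – the actual Module 4 templates are known only partially through ATIP releases – but the structure shows the reversal: the Officer selects refusal grounds from a menu, and the “reasons” are assembled afterwards from boilerplate rather than written from the evidence.

```python
# Hypothetical sketch of a refusal notes generator: canned reasons are
# assembled from selected grounds *after* the decision is made. Template
# text and ground names are invented; real Module 4 templates differ.

TEMPLATES = {
    "purpose_of_visit": (
        "I am not satisfied that the applicant would leave Canada at the "
        "end of their stay, based on the purpose of visit."
    ),
    "personal_assets": (
        "I am not satisfied that the applicant would leave Canada, based "
        "on their personal assets and financial status."
    ),
}

def generate_refusal_notes(selected_grounds: list[str]) -> str:
    """Assemble GCMS-ready notes from pre-written boilerplate.

    Note the inversion: the decision (refuse) comes first, and these
    'reasons' are generated afterwards, with no case-specific facts.
    """
    body = " ".join(TEMPLATES[ground] for ground in selected_grounds)
    return f"Application reviewed. {body} Refused."

print(generate_refusal_notes(["purpose_of_visit", "personal_assets"]))
```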

It is hard to see how this is not AI. Indeed, if you look at the comparables provided – the eTA, Visitor Record, and Study Permit Extension automations in GCMS – similar automations with GCMS underpin Chinook. There may be a little more human interaction, but as discussed below, a human monitoring or implementing an AI/advanced analytics/triage system doesn’t remove the AI elements.

 

Human in the Loop is Not the Defining Feature of AI

The defense we have been hearing from IRCC is that there is a human ultimately making a decision, and that therefore it cannot be AI.

This obscures a different concept called human-in-the-loop, which the Policy Playbook suggests actually needs to be part of all automated decision-making processes. If you are following, this means that the defense that a human is involved (therefore not AI) is actually a key defining requirement IRCC has placed on AI systems.

It is important to note that there certainly is a spectrum of AI application at IRCC, and it appears to be leaning away from human-in-the-loop. For example, IRCC has disclosed in their Algorithmic Impact Assessment (“AIA”) for the Advanced Analytics Triage of Overseas Temporary Resident Visa (“TRV”) Applications that there is no human in the loop for the automation of Tier 1 approvals. The same approach, without a human in the loop, is used to automate eligibility approvals in the Spouse-in-Canada program, which I will write about shortly.
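For readers who want the mechanics, a minimal sketch of such a triage split follows. The tier cutoffs and score source are assumptions for illustration – the AIA does not disclose the model’s internals – but the point stands on the structure alone: when Tier 1 files are approved straight through, the human in the loop exists only for whatever the model does not clear.

```python
# Hypothetical sketch of advanced-analytics triage with no human in the
# loop for Tier 1. Score thresholds and tiers are invented; the AIA does
# not disclose the model's internals.

def triage(application_id: str, model_score: float) -> str:
    """Route an application by model score.

    Tier 1 eligibility is approved automatically -- no Officer ever
    reviews it. Only Tier 2/3 files reach a human.
    """
    if model_score >= 0.9:      # assumed Tier 1 cutoff
        return f"{application_id}: Tier 1 -> eligibility auto-approved"
    if model_score >= 0.5:      # assumed Tier 2 cutoff
        return f"{application_id}: Tier 2 -> routed to an Officer"
    return f"{application_id}: Tier 3 -> routed to an Officer (full review)"

for app_id, score in [("V100-001", 0.95), ("V100-002", 0.62), ("V100-003", 0.31)]:
    print(triage(app_id, score))
```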

 

Why the Blurred Line Between Process Automation and Automated Decision-Making Process Should Not Matter – Both Need Oversight and Review

Internally, this is an important distinguishing characteristic for IRCC, because it appears that at least internal/behind-the-scenes strategizing and oversight (if that is what the Playbook represents) applies only to automated decision-support systems and not to business automation. Presumably such a classification may allow for less review and more autonomy for the end user (the Visa Officer).

From my perspective, we should focus on the last part of what IRCC states in their playbook – namely that ‘staff should consider whether automation that seems removed from final decisions may inadvertently contribute to an approval or a refusal.’

To recap and conclude, the whole purpose of Chinook is to render approvals and refusals more quickly and in bulk fashion to save Officers’ time. The automation of all functions within Chinook therefore contributes to a final decision – not inadvertently, but directly. The very manner in which decisions are made in immigration shifts as a result of the use of Chinook.

Business automation cannot and should not be used as a cover for the ways in which seemingly routine automations actually affect processing that would otherwise have been done by humans – providing Officers a particular type of data, displayed on screen in a manner that can fetter their discretion and alter the business of old.

That use of computer technology – the creation of Chinook – is 100% definable as the implementation of AI.

 

About Us
Will Tao is an Award-Winning Canadian Immigration and Refugee Lawyer, Writer, and Policy Advisor based in Vancouver. Vancouver Immigration Blog is a public legal resource and social commentary.
