Canadian Immigration Law Blog

Award-Winning Canadian Immigration and Refugee Law and Commentary Blog

Blog Posts

Five AI Decision-Making Questions We Need Answers to From IRCC

In this short post, I will canvass five relatively urgent questions to which we need collective answers as we represent clients whose applications are now being processed by decision-making systems built on artificial intelligence. For clarity, and to adopt IRCC's status quo position, I will not treat Chinook as one of those systems. However, it is clear that Chinook interacts with AI, and the role Chinook plays in decisions will become increasingly important, especially where advanced analytics skips the eligibility assessment.

1) If IRCC is basing Advanced Analytics decisions on historical data, what historical data is being utilized? Does it represent a reasonable/ideal officer, and how can it be re-programmed?

How do we ensure it represents an ideal period (not the work of stressed, overburdened officers)? IRCC has been overburdened with applications for the last decade, has had to create systems to shortcut decision-making, and has openly acknowledged its resource crunch. If the historical data does not represent what we want for future processing, how can projections be changed? How, in practice, does bias get stripped or de-programmed out of the data? We have seen positive impacts since recent advocacy (for example, Nigerian study permit approval rates), but is that programmed in manually by a human, and how?

2) How does Advanced Analytics interact with Chinook?

In the past, Chinook was utilized for only a portion of cases, and we understand it was used both to bulk approve and to bulk refuse. If Advanced Analytics serves to provide automatic positive eligibility findings, why is Chinook even needed to sort the Applicant's information and decide whether to approve or refuse? Is there a column in Chinook that allows an Officer to see whether Eligibility has already been met (i.e., the case was 'AA'd'), thereby altering how they apply and use Chinook? The fear is that Chinook becomes just a refusal tool and is no longer needed for approvals.

Furthermore, what does an Officer see when they have to perform an eligibility assessment? Are they given any information about data trends, key risk indicators, or other outputs that Advanced Analytics presumably helped generate during triage? Is it something the Officer has to dig for in a separate module of Chinook, or is it displayed right in front of them as they render a decision?

Are Officers made aware when a case goes into manual review, for example as quality assurance for an automated decision? How are those cases tracked?

3) What is the incentive to actually process a non-AA decision if AA decisions can be processed more accurately/quickly?

For those files triaged into the non-Green/Human bin, if it becomes a numbers game and processing is no longer 'first in, first out', why even process the complex cases anymore? Why not fill the slots with newer AA/low-risk cases that will create fewer challenges, and let decisions that are complicated or require human intervention sit for one or two years until the Applicant seeks a withdrawal? Other than mandamus, what remedies will Applicants have to resolve their cases? Is it simply about complaining hard enough to get pulled out of review, only for an eventual refusal? How do we ensure we do not refuse all Tier 2/3 cases as a matter of general practice as we get more Tier 1 applications in the door (likely from visa-exempt, Global North countries)?

4) What does counsel for the Department of Justice see in GCMS/Rule 9 Reasons versus what we see?

Usually, the idea of a tribunal record or GCMS is that it is a central record of an Applicant's file, but with increasing redactions it is becoming less and less clear who has access to what information. Clients are triaged using 'bins', but those bins are stripped from the GCMS notes we receive. Are they also stripped for the DOJ? Right now, local word flags and risk indicators are stripped for applicants, but are they also stripped for the DOJ? What about the audit trail that exists for each applicant, which we have not been able to obtain via ATIP?

Taking it a step further – what constitutes a Tribunal Record anymore? Is it only what was submitted by the Applicant and what is in the Officer's final decision? I know my colleague Steven Meurrens has started to obtain even email records between Officers, but there is a lack of clarity on what the Tribunal Record consists of and whether it must necessarily include the audit trail, risk indicators, and local word flags. Should it include the algorithms?

How does one even try to make fettering arguments if we do not know what the Officer had access to before rendering a decision (how they were possibly fettered)?

The other question becomes: how do we let the judiciary know about these systems? Does it go up as a DOJ-led reference (and, if so, who can intervene and be on the other side)? The strategic litigation will likely be run again on a weak-fact case. How do we ensure counsel on the other side is prepared, so they can not only fight back but provide a counter-narrative to the judiciary on these issues?

5) Will the Triaging Rules ever be Made Public? 

Currently, from our understanding, the AI is quite basic. Key rules are inputted, and applications that meet the requirements go through a decision tree that leads to auto-eligibility approvals. However, as these AA programs adopt more machine learning components, allowing them to sniff out new flags, new rules, and new issues, will there be some transparency around what the rules are? Should there be different treatment between rules that sit more on the security/intelligence/system-integrity side and more black-and-white rules, such as: only individual applicants can get Tier 1 processing; applicants must not have had a previous refusal to benefit from X; or holding a U.S. visa or a previous Canadian visa within the past ten years is a Tier 1 factor?
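To make this concrete, here is a minimal, purely hypothetical sketch (in Python) of what a layer of black-and-white triage rules could look like. The field names, rules, and tier labels below are my own illustrative assumptions drawn from the examples in this post – they are not IRCC's actual rules, thresholds, or code.

```python
from dataclasses import dataclass


@dataclass
class Application:
    # Illustrative fields only; not actual IRCC data elements.
    is_individual: bool              # assumed rule: group/family-linked files excluded from Tier 1
    has_previous_refusal: bool       # assumed rule: any prior refusal blocks auto-eligibility
    held_us_or_cdn_visa_10yr: bool   # assumed positive Tier 1 factor


def triage(app: Application) -> str:
    """Toy decision tree returning a triage outcome; not IRCC's actual rule set."""
    if not app.is_individual or app.has_previous_refusal:
        return "Tier 2/3 - route to human eligibility review"
    if app.held_us_or_cdn_visa_10yr:
        return "Tier 1 - auto-positive eligibility"
    return "Tier 2 - route to human eligibility review"


# Example: an individual applicant with no prior refusal and a previous U.S. visa
print(triage(Application(True, False, True)))  # -> "Tier 1 - auto-positive eligibility"
```

Even a toy version like this makes the transparency question concrete: an applicant's routing turns entirely on which rules sit in the tree and how they are ordered, which is precisely the information that is not public.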

If the ultimate goal is also to use these rules to affect processing (lowering the number of applicants and raising approval rates), then presumably telling the public about these factors, so that those without strong cases may be dissuaded from applying, could be of benefit.

Just some random Monday morning musings as we dig further. Stay tuned.


How Much More Likely is an SDS Study Permit to Get Approved Than a Non-SDS Study Permit? – A Stats Look

One of the common questions we get asked by applicants (and a topic on which rumours constantly fly) is whether it makes sense to pursue IRCC's Student Direct Stream (SDS) or just go the regular route.

I recently obtained data from an IRCC request that helps contextualize this question a bit. In the interest of making the data easier to understand, I decided to look only at January to August 2022. This sample size necessarily limits our analysis, but I think it gives us a good microcosm to examine. January to August 2022 is not hindered (as much) by the COVID-19 restrictions of 2019-2020, and 2021 was, for all intents and purposes, a 'straddle' year.

This investigation is important because there have been rumours and allegations – for example, that India SDS is not worth the effort (and that locally decided non-SDS cases actually do better), or that for Philippines applicants, SDS is a largely ineffective process.

Without further ado, here is the raw data. Remember, for purposes of visualization I did not break down the actual numbers of applications, and I did not compute an overall average, because that depends on the actual totals, which will take a bit more time to calculate given the way the data was presented.

Approval % of SDS/NSE Applications by Country of Residence/Citizenship (via IRCC CDO)

Country       Jan-22  Feb-22  Mar-22  Apr-22  May-22  Jun-22  Jul-22  Aug-22
India            72%     67%     69%     64%     60%     55%     57%     62%
Nigeria          61%     60%     68%     76%     91%     92%     63%     91%
China            86%     61%     58%     77%     83%     83%     76%     88%
Philippines      40%     40%     38%     50%     53%     40%     46%     48%
Vietnam          79%     82%     82%     66%     62%     74%     71%     82%
Pakistan         25%     40%     43%     40%     59%     56%     77%     67%
Approval % of Non-SDS/NSE Applications by Country of Residence/Citizenship (via IRCC CDO)

Country       Jan-22  Feb-22  Mar-22  Apr-22  May-22  Jun-22  Jul-22  Aug-22
India            12%     25%     24%     20%     36%     38%     35%     42%
Nigeria          44%     34%     26%     30%     31%     34%     69%     63%
China            74%     48%     72%     78%     82%     84%     90%     82%
Philippines      57%     58%     55%     82%     75%     84%     77%     76%
Vietnam          58%     51%     59%     79%     72%     61%     81%     55%
Pakistan         24%     17%     44%     17%     36%     48%     37%     38%

I have a few big takeaways:

  1. Philippines SDS is the only SDS stream with an approval rate that is significantly and consistently below non-SDS. LJ Dangzalan has been talking about this a ton, and the numbers back it up;
  2. The India non-SDS rumour appears to be just that – a rumour. It may be select cases or the 'overselling' of local services, but the numbers do not back it up;
  3. Pakistan SDS makes a big difference, and the last four months show it; and
  4. The Nigerian student advocacy (and Nigerian Student Express) is trending well.

 

Is there anything else interesting you can gather from the data that catches your eye?
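For anyone who wants to poke at the numbers themselves, here is a quick sketch (in Python) of one way to compare the two tables: a simple unweighted average of the monthly gap between SDS/NSE and non-SDS/NSE approval rates for each country. The caveat above still applies – without the underlying application counts, this is an average of monthly percentages, not a volume-weighted approval rate.

```python
# Monthly approval rates (%) transcribed from the two tables above (Jan-Aug 2022).
sds = {
    "India":       [72, 67, 69, 64, 60, 55, 57, 62],
    "Nigeria":     [61, 60, 68, 76, 91, 92, 63, 91],
    "China":       [86, 61, 58, 77, 83, 83, 76, 88],
    "Philippines": [40, 40, 38, 50, 53, 40, 46, 48],
    "Vietnam":     [79, 82, 82, 66, 62, 74, 71, 82],
    "Pakistan":    [25, 40, 43, 40, 59, 56, 77, 67],
}
non_sds = {
    "India":       [12, 25, 24, 20, 36, 38, 35, 42],
    "Nigeria":     [44, 34, 26, 30, 31, 34, 69, 63],
    "China":       [74, 48, 72, 78, 82, 84, 90, 82],
    "Philippines": [57, 58, 55, 82, 75, 84, 77, 76],
    "Vietnam":     [58, 51, 59, 79, 72, 61, 81, 55],
    "Pakistan":    [24, 17, 44, 17, 36, 48, 37, 38],
}

# Unweighted average of the monthly gap (SDS minus non-SDS), in percentage points.
for country in sds:
    gaps = [s - n for s, n in zip(sds[country], non_sds[country])]
    print(f"{country:12s} {sum(gaps) / len(gaps):+6.1f} pts")
```

A positive number means the SDS/NSE route outperformed the regular route on average over these eight months; a negative number (as with the Philippines) means the opposite.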

 


[Legal Rant] Addressing “Gaming the System” Concerns – Indian Study Permit Applicants and the SDS Example

 

Perhaps this post is inevitable. The reach of s. 91 IRPA might be difficult to both control and manage from inside Canada. Other than bulk refusing the applicants of certain unauthorized practitioners/agents who are found out (and, it appears, only in egregious cases), it is highly unlikely we will see any enforcement. We will try to cut off the Canadian links, but too often they go under the surface – a quick Use of Representative form signed here, a quick portal creation there, and the situation goes unnoticed by the applicant and our system-defending officers.

What is increasingly troublesome, however, is that there appear to be ways the system is effectively being gamed – or at least is being marketed that way. As part of my work I am quite public on Twitter and social media, and I invite those with tips and leads to tell me what is going on at ground level. This is one of those tips I received.

Here is one case that I think should trouble some folks.

Canada's Student Direct Stream (SDS) has been lauded by many in the policy space for creating a subcategory of 'ready-to-study' students with good language scores, funds to pay first-year tuition, and a GIC. While I do not have recent stats and need to obtain them, the benefits of these programs have historically been faster processing and higher (by 10-12%) approval rates to reward students who did the legwork. One of the unique features of the SDS application is that the approval (and refusal) is issued at Case Processing Centre ("CPC") Edmonton in Canada, taking a large weight off the local visa offices and triaging cases more effectively.

I also want to give a bit of context for writing this piece. A study permit was refused, and the individual decided to go to an unauthorized representative for the subsequent follow-up application. That agent told the individual that previous counsel had provided too many explanations and letters, and that the key to approval was to ensure the Visa Application Centre ("VAC"), and by extension the local visa office, could flag the file after submission. They recommended against submitting another SDS application.

Based on my credible source, who canvassed other immigration agents from India who confirmed the same, the on-the-ground knowledge now is that SDS applications will take significantly longer than regular applications, and that to get approvals, the best thing to do is to have the file processed at the local level, in Delhi.

The way to do this is twofold:

  1. Make sure the language test taken is the PTE – so that the file has to be processed locally in Delhi; and
  2. Make sure that only first-semester, and not first-year, tuition is paid, to avoid SDS processing and keep the file local.

This is not the first time we have heard of perceived preferential processing for non-SDS applications. We have heard similar things in the past for applicants from the Philippines and Pakistan, along with Applicants who have taken various tips to try to get their cases triaged differently. These concerns seem to be amplified by the fact that September school deadlines are approaching and applicants need decisions rendered more quickly than the SDS ones, which have been taking several months. Agents are telling students (and apparently the results show) that approvals are coming at the local office level.

RANT: I think we have to get to a point now where we ask ourselves why we would create a stream like SDS only to have it take longer to process and perhaps offer less competitive approval odds.

Without the stats at this stage, I can only pass along the anecdotes I am hearing, but there is enough concern on the ground (I am not going to use the word 'qualitative' this week – it doesn't work) that applicants are being guided by unauthorized practitioners into 'gaming the system.' I believe it is enough of a concern that someone should step in to ensure transparency and proper communication.

Either there should be no discrepancy in processing times (thus removing the incentive of speed), or there should be a clear policy aim of significantly higher approval rates for SDS than for non-SDS streams – as should be the case, given the documentation required to be obtained and submitted before applying.

As IRCC moves to implement technological changes and institute rules that will triage applications, it must be very aware of those who may have unauthorized access to those rules, or who are learning how they work in order to circumvent them. If data (beyond the anecdotal) also comes out supporting certain tactics, applicants will adjust their behaviours – and will be led to do so by unauthorized reps.

If SDS is the superstar program, worthy of global expansion, that it is marketed as, there is no reason it should take longer and make one's application less likely to succeed. The doors to exploitation open up if there is no consistency here.

[End of Rant]


Ocran v. Thavaratnam and Hoku: How a Chinook Decision is Bootstrapped in Judicial Review and Strategies to Counter

I am writing this post after noticing a troubling trend and pattern. Unfortunately, when counsel are unprepared for how the Department of Justice and IRCC work in tandem on these cases (and, to be honest, some Judges have also fallen into the trap of this strategy), the result can be JRs dismissed at the leave stage.

Here’s the pattern:

  1. Temporary Resident Refusal – a template refusal letter and further templated GCMS notes, all built on template refusal language. The refusal letter contains template grounds, and the GCMS notes then indicate further broad refusal grounds, such as concerns about the applicant being 'young, single, and mobile', concerns about the 'applicant's family ties in Canada,' the studies not being a 'reasonable expense', or the applicant's 'past mark sheets' being of concern – among many others.
  2. Judicial Review – Applicant's Record filed by the Applicant – usually on the grounds that only conclusory statements were made, with limited reference to material facts and resting on speculation, and therefore the decision is not intelligible, transparent, or justified (Vavilov);
  3. Respondent's Memorandum – the Applicant is trying to relitigate the facts, the onus is on the Applicant to prove they meet the requirements, Officers are not required to consider every fact, and the decision is reasonable. The DOJ then usually goes out of its way to add some detail or justification (I call this portion bootstrapping, or counsel's speculation) to tie together the pieces of the decision.
  4. The Judge either refuses leave or grants a hearing but renders a decision that questions the Applicant's efforts to provide evidence, or asks why more was not provided – the Judge oftentimes stepping into the role of bootstrapping or speculating. Counsel tries to argue back that strong evidence was provided, and the case becomes about relitigating and justifying the decision on the evidence that was before the decision-maker (the Applicant loses, often).

 

Assertions of Facts/Consideration of Evidence Leading to Conclusion or Just Conclusions With No Facts

If I wanted to refuse a study permit application reasonably, I could do it: highlight some sort of shortcoming, misstep, or inadequacy in the evidence, tie it to a refusal ground, and lead to a conclusion that the Officer was not convinced.

However, in this new Chinook world, the conclusions often come first, and – perhaps unintentionally – there are few to no actual facts in the decisions. This is because, as we know, decisions are being bulk refused, and at other times Officers do not have enough time to properly suss out the facts beyond stopping at the first grey area they see.

Very commonly decisions read like this:

The Applicant has a girlfriend in Canada. Therefore the applicant’s family ties and economic ties do not satisfy me they will leave Canada at the end of their authorized stay.

Unfortunately, there is no logical step connecting the fact to the finding – and even where there is, there remains an obligation to consider evidence that might lead to the opposite finding, which rarely happens.

 

Reverse Engineering the Decision – DOJ's Position Supported by Ocran v. Canada (Citizenship and Immigration), 2022 FC 175

I generally love Justice Little's decisions – his interpretation and application of Vavilov is top notch – but I would say his decision in Ocran is one that went beyond a judicial review of the decision (para 24 onwards) to almost stepping into the shoes of the Officer to re-evaluate the factual record in light of the sparse GCMS notes. Nowhere in the decision does Justice Little actually address the flawed nature of an analysis based on 'reading in' justifying evidence. In short, I think Ocran opens the door (and maybe I read Vavilov the wrong way) to reverse engineering a refusal decision from stated conclusions that rest on limited factual reliance by the decision-maker.

The approach taken in Ocran has inspired the same process by the Department of Justice in other cases, and unsurprisingly the decision is now being cited for the proposition that the Record can be read into the sparse GCMS notes. The harm of this is that template language that was never meant to analyze or apply the facts to the decision reached is now retroactively used to justify that decision.

While I celebrated (sort of) Justice Little's decision not to break down or opine on the Chinook system, perhaps seeking to contextualize how Officers render their template decisions using Chinook would have kept him from stepping in to provide as detailed a factual analysis as he did.

As a side note, even more worrisome is that I have seen a case, after judicial review (on consent), go back to IRCC where the Officer refused again – this time by adding one line of fact (citing the Record) between each line of their previously templated reasons. In short, it is not difficult to rewrite a Chinook decision to make it appear reasonable, even if it was found unreasonable the first time.

 

How to Counter – Thavaratnam v. Canada (MCI), 2022 FC 967

On the more hopeful side of things, a recent decision by Madam Justice Furlanetto in Thavaratnam offers applicants some encouragement.

In this decision, the Officer refused a temporary resident visa for an applicant from Sri Lanka, utilizing what Madam Justice Furlanetto refers to in para 19 as blanket or boilerplate statements and a series of conclusions (para 20).

She notes the gaps in the Officer's reasoning and the DOJ's attempts to explain them, but concludes that this does not cure the inadequacy of the reasons for decision.

She writes:

[24] The Respondent proposes various explanations for the Officer’s conclusions. It asserts that the Applicant’s ties to Sri Lanka are weak when weighed against his family residing in Canada because only his wife is in Sri Lanka and they have no children. The Respondent asserts that the Applicant’s savings equate to $18,000 CAD and his pay for the year $5,000 CAD, which is extremely low by Canadian standards. It suggests that the business activities cannot be verified because they are training activities at a private organization owned by a relative. These explanations, however, were not those given in the GCMS notes. Counsel’s speculation of a plausible explanation cannot cure the inadequacy of the reasons for decision (Asong Alem v Canada (Citizenship and Immigration), 2010 FC 148 [Asong Alem] at para 19).

This is a paragraph that needs to become commonplace in responses where the DOJ seeks to take the reasons beyond what is written in order to piece together what are, in fact, gaps on the page. (Recall: Komolafe v Canada (Minister of Citizenship and Immigration), 2013 FC 431, 16 Imm. L.R. (4th) 267, at para 11.)

 

Contextualizing Bootstrapping for the Federal Court – Hoku v. Canada (Citizenship and Immigration), 2019 FC 362

I often start my replies by contextualizing the DOJ's practice. Again, I like to use the language of bootstrapping, borrowing from the wording of Justice Ahmed.

Respondent’s Position re: Family Ties Bootstraps the Officer’s Decision and Commits the Same Error as the Officer of Failing to Analyze Family Ties

3. With respect, the Respondent's submissions regarding family ties bootstrap the decision of the Officer in this matter. Justice Ahmed writes about the practice of bootstrapping in Hoku v. Canada (Citizenship and Immigration), 2019 FC 362 ["Hoku"], beginning at para 13.

[13] The Applicant also submits that her detailed submissions and supporting documents were not considered. The Applicant explains that this evidence included her immigration history, personal background, bona fides about her spiritual healing, the nature of her criminal conviction, and indicia of rehabilitation. The Applicant points out that the Minister’s Policy Manual states that all of these factors must be considered, and argued that the Respondent’s submissions bootstrap the actual decision and the reasons discernable from the GCMS notes.

[14] The Respondent submits that the Applicant simply failed to establish that her circumstances justify issuing an ARC, which is not intended to routinely allow persons to overcome a deportation order (Andujo at para 26). The Respondent also submits that it is unclear if the Applicant explained to the Decision-Maker that she exited Canada to comply with her probation order and objects to any inclusion of information not before the Decision-Maker.

[15] First, I agree with the Applicant that the Respondent’s submissions bootstrap the actual Decision-Maker’s reasoning. For example, there is nothing to support the Respondent’s Memorandum at paragraph 16 which states that the Decision-Maker found that the Applicant’s reasons for requesting an ARC were not compelling.
(emphasis added)

Hoku at paras 13-15.

I then apply this, highlighting what the DOJ says, and respond as follows:

The reality is that the Officer does not offer any justification for the templated reasons they have provided. The Officer merely states that the Applicant “is single, mobile, is not well established and has no dependents.” The Respondent is attempting to fill in the gaps on the page with their own analysis, which is not the purpose of judicial review and represents the very process of bootstrapping.

I hope this piece was helpful. Again, I love judicial review practice and am excited to make an announcement in September (so soon!) about our further shift in this direction and this work. I hope young counsel interested in the work slow down and do their research before engaging the DOJ on a Chinook refusal JR.

About Us
Will Tao is an Award-Winning Canadian Immigration and Refugee Lawyer, Writer, and Policy Advisor based in Vancouver. Vancouver Immigration Blog is a public legal resource and social commentary.
