Prediction and management of small-for-size syndrome in living donor liver transplantation
Article information
Abstract
Small-for-size syndrome (SFSS) remains a critical challenge in living donor liver transplantation (LDLT), characterized by graft insufficiency due to inadequate liver volume, leading to significant postoperative morbidity and mortality. As the global adoption of LDLT increases, the ability to predict and manage SFSS has become paramount in optimizing recipient outcomes. This review provides a comprehensive examination of the pathophysiology, risk factors, and strategies for managing SFSS across the pre-, intra-, and postoperative phases. The pathophysiology of SFSS has evolved from being solely volume-based to incorporating portal hemodynamics, now recognized as small-for-flow syndrome. Key risk factors include donor-related parameters like age and graft volume, recipient-related factors such as MELD score and portal hypertension, and intraoperative factors related to venous outflow and portal inflow modulation. Current strategies to mitigate SFSS include careful graft selection based on graft-to-recipient weight ratio and liver volumetry, surgical techniques to optimize portal hemodynamics, and novel interventions such as splenic artery ligation and hemiportocaval shunts. Pharmacological agents like somatostatin and terlipressin have also shown promise in modulating portal pressure. Advances in 3D imaging and artificial intelligence-based volumetry further aid in preoperative planning. This review emphasizes the importance of a multifaceted approach to prevent and manage SFSS, advocating for standardized definitions and grading systems. Through an integrated approach to surgical techniques, hemodynamic monitoring, and perioperative management, significant strides can be made in improving the outcomes of LDLT recipients. Further research is necessary to refine these strategies and expand the application of LDLT, especially in challenging cases involving small-for-size grafts.
INTRODUCTION
One of the primary challenges in liver resection is the risk of post-hepatectomy liver failure (PHLF), particularly when a significant portion of the liver must be removed. Although the safety profile of living donor liver transplantation (LDLT) has improved significantly over the past few decades, concerns about graft dysfunction due to small graft size persist. In LDLT, there is a delicate balance between ensuring donor safety and avoiding excessive liver resection, which could lead to liver failure. However, this caution may transfer risk to the recipient, as inadequate graft size can result in graft insufficiency, leading to increased morbidity and mortality. Determining the minimum acceptable graft size for recipients remains a topic of ongoing debate [1]. In 1996, Emond et al. [2] highlighted the clinical course of small-for-size transplants, describing significant functional impairment accompanied by paradoxical histologic changes indicative of ischemia. This observation led to the emergence of the concept of small-for-size syndrome (SFSS) in LDLT, characterized by clinical manifestations of graft insufficiency, including delayed cholestasis and increased ascites [2].
While the initial concept of SFSS primarily focused on liver graft size as the key determinant of recipient safety in LDLT, recent advancements have expanded this understanding to include the role of portal hemodynamics in relation to graft size, now recognized as small-for-flow syndrome [3]. Numerous studies have deepened our comprehension of the pathophysiology of SFSS, highlighting the critical importance of maintaining appropriate portal flow and pressure. Failure to achieve these conditions can lead to graft dysfunction, graft loss, and increased recipient mortality [4]. As the global use of LDLT continues to rise and experience with managing its complications grows, unresolved issues surrounding SFSS have prompted further discussion and research, culminating in a consensus conference held in Chennai, India, in 2023 [5].
In this comprehensive review, we will delve into the intricate pathophysiology of SFSS, exploring the underlying mechanisms that contribute to its development in LDLT. We will examine the key risk factors that predispose patients to SFSS, along with current methodologies for predicting its occurrence. Additionally, we will provide an in-depth analysis of the various strategies employed across the pre-operative, intra-operative, and post-operative phases to manage and mitigate the risk of SFSS. These strategies include optimizing graft selection, adjusting portal hemodynamics, and employing novel surgical and pharmacological interventions aimed at enhancing graft function and patient outcomes. Through this review, we aim to provide a thorough understanding of SFSS and offer practical insights into the best practices for its prevention and management in the context of LDLT.
DEFINING SFSS
The liver dysfunction observed after LDLT, primarily associated with the use of small liver grafts, has been described by various terms, including early allograft dysfunction (EAD) and primary allograft dysfunction (PAD) [6-8]. These conditions share clinical manifestations such as hyperbilirubinemia, coagulopathy, excessive ascites, and other signs resembling PHLF. EAD is characterized by suboptimal liver function in the immediate post-transplant period, often caused by factors such as ischemia-reperfusion injury, poor graft quality, or inadequate preservation [6,7]. PAD, on the other hand, represents a more severe form of graft dysfunction occurring within the first week post-transplant, leading to early graft failure due to severe ischemia-reperfusion injury, technical complications, or significant preservation injury [8]. Pomposelli et al. [6] and Okamura et al. [7] defined EAD as having a total serum bilirubin level of >10 mg/dL or an international normalized ratio (INR) of >1.6 on day 7, both of which are associated with a high risk of graft loss or mortality. Similarly, Ikegami et al. [8] characterized primary graft dysfunction as severe graft dysfunction post-LDLT, with a total serum bilirubin level of >20 mg/dL and an INR of >2 after day 7, correlating this with a 77% rate of graft loss.
Soejima et al. [9] first described SFSS as a condition where patients developed serum total bilirubin levels >5 mg/dL or produced >1 L of ascites by day 14 post-LDLT, or >500 mL/day by day 28. In 2005, Dahm et al. [10] refined this definition, describing small-for-size dysfunction as graft dysfunction in a “small” partial liver graft (graft-to-recipient weight ratio [GRWR] <0.8%) within the first postoperative week, after excluding other causes such as technical, immunological, or infectious complications. Graft dysfunction was specifically defined as the presence of two or more of the following on three consecutive days: bilirubin >100 μmol/L, INR >2, and encephalopathy grade 3 or 4. Graft failure was identified when dysfunction necessitated retransplantation or resulted in the recipient’s death. However, it is important to note that Dahm et al.’s study was based on a literature review rather than actual patient data. A comparison of these definitions is provided in Table 1.
Due to the varied definitions and terminology used to describe the similar clinical phenomenon of liver dysfunction following LDLT with partial liver grafts, an international consensus has recommended standardizing the term “SFSS” to describe this condition [11]. SFSS is now defined as a clinical syndrome resulting from a partial liver graft that is insufficient to meet the metabolic demands of the recipient in the absence of specific surgical or nonsurgical complications [11]. Table 2 outlines the differences between the various published descriptions and definitions of SFSS.
PARADIGM SHIFT FROM SIZE TO FLOW IN SFSS
SFSS after LDLT and PHLF are traditionally linked to liver failure due to excessive reduction in liver mass. Clinical symptoms include hyperbilirubinemia, coagulopathy, encephalopathy, and refractory ascites. Historically, the safe threshold for liver resection or transplantation has been determined by the graft-to-body weight (BW) ratio, with a ratio above 0.8 considered safe. However, this size-based approach has proven unreliable, as some patients develop SFSS despite meeting the “safe” threshold, while others who fall below it do not [3]. Shifting the focus from “size” to “flow” could enhance our understanding of SFSS, with significant implications for clinical management. In addition to liver volumetry, the functional quality of the liver parenchyma, particularly the future liver remnant (FLR), should be evaluated before liver resection. During surgery, hepatic hemodynamic parameters should be closely monitored, as they provide a more accurate basis for determining the “safe” threshold of viable liver parenchyma. Unlike liver mass, hepatic hemodynamic parameters can be manipulated, allowing for adjustments to the “safe” threshold based on real-time conditions. This shift from a “small-for-size” to a “small-for-flow” paradigm represents a major advancement in optimizing donor liver use, expanding surgical indications, and increasing the safety of hepatic surgeries (Table 3).
NEW GRADING SYSTEM IN SFSS
At the 2023 consensus conference organized by ILTS-iLDLT-LTSI in Chennai, a refined definition of SFSS was introduced, shifting the emphasis from “size” to “flow” and incorporating a new grading system [11]. This grading system is designed to provide an objective, evidence-based approach to classifying the severity of SFSS into three grades: A, B, and C. The system includes detailed criteria for stratifying the severity of SFSS and outlines the associated risks of adverse outcomes based on these grades. The goal of this grading system is to establish a standardized language and framework that the international liver transplantation (LT) community can use for clinical practice, research, and education related to SFSS in LDLT (Table 2).
SHEARING STRESS ON SMALL LIVER POST RESECTION OR SMALL LIVER GRAFT IN LDLT (Fig. 1)

Pathophysiology of SFSS in LDLT. SFSS, small-for-size syndrome; LDLT, living donor liver transplantation.
In SFSS, the primary mechanisms of injury include increased portal flow and hyperperfusion in the small liver graft, leading to reduced hepatic arterial inflow, elevated portal pressure, and sinusoidal resistance, which contribute to biliary injury and dysfunction (Fig. 1). Histopathological signs include cholestasis, hepatocyte ballooning, mitochondrial swelling, and areas of ischemic necrosis [12]. These mechanical injuries are compounded by inflammatory and immune responses, as well as ischemia-reperfusion injury.
Following extended liver resection or reperfusion of a liver graft, shearing stress from elevated portal pressure can lead to perisinusoidal and periportal hemorrhage, followed by arterial vasoconstriction and ischemic cholangitis [13]. The “hepatic arterio-portal buffer” phenomenon, observed in both experimental and clinical studies, shows that increased portal blood flow relative to liver weight results in an inverse relationship between portal and arterial blood flows, leading to ischemic necrosis and biliary injuries [14-16].
Several studies have highlighted the impact of portal and hepatic arterial blood flow on the development of SFSS during LDLT [17-19]. Jiang et al. [19] found that portal blood flow exceeding 300 mL/min/100 g significantly increases the risk of SFSS. Troisi et al. [20] reported that constructing a portal-systemic shunt when portal blood flow exceeded 250 mL/min/100 g in grafts with a GRWR <0.8 helped prevent SFSS and improved survival outcomes [18,20].
Similarly, Boillot et al. [21] demonstrated that mesocaval shunts in LDLT with small grafts have beneficial effects. Fondevila et al. [16], using porcine models, suggested that increased portal flow stimulates both hepatic regeneration and sinusoidal damage, advocating for portocaval anastomosis as a preventive measure against SFSS [16,22]. Surgical techniques that reduce portal vein flow (PVF) and pressure while increasing hepatic artery (HA) flow can prevent small-for-size flow syndrome after major hepatic resections [23]. Increased portal flow to a small remnant liver causes sinusoidal congestion, endothelial damage, and hepatocyte injury. Elevated portal pressures also reduce HA flow, leading to ischemic biliary injury and cholangitis, ultimately impairing liver regeneration.
Portal pressure is a critical factor in predicting graft failure. Yagi et al. [24] found that portal pressure >20 mm Hg is associated with ascites, coagulopathy, hyperbilirubinemia, and early graft hypertrophy, with increased hepatocyte growth factor and reduced vascular endothelial growth factor levels indicating an impact on liver regeneration. Kaido et al. [25] demonstrated that controlling portal pressure below 15 mm Hg in small grafts (GRWR of 0.6) resulted in survival rates similar to standard-sized grafts and reduced donor complications.
This evidence underscores the importance of achieving optimal portal blood flow and pressure during LDLT to prevent SFSS. Strategies such as portocaval anastomosis [20,26], splenic artery ligation (SAL) [27-29], and somatostatin (SST) administration [30] can modulate portal inflow and improve outcomes. The evolution of volume and flow modulation strategies in mitigating SFSS will be further discussed in the section on SFSS management, including the use of portal inflow modulation (PIM) in LDLT.
RISK FACTORS FOR SFSS IN LDLT
The development of SFSS in LDLT is influenced by three major factors: donor selection, recipient selection, and portal hemodynamics and its modulation (to be discussed in the subsequent section) (Fig. 2). Optimizing each step in this process is crucial to ensuring positive outcomes.
DONOR RISK FACTORS
Although donor age alone has not been directly linked to SFSS, concerns arise with older living donor liver grafts due to their lower regenerative potential and higher parenchymal resistance, which increase risks for both donor and recipient [31-33]. Studies indicate that grafts from donors over 45 years old are at higher risk for SFSS and have inferior graft survival, particularly when combined with factors like steatosis, lower GRWR, high-acuity recipients, intraoperative portal venous pressure (PVP) >19 mm Hg, and ABO incompatibility (ABO-i) [6,34-38]. Older grafts also show poorer tolerance to PVP >15 mm Hg, complicating PIM [39].
While some LDLT centers exclude donors with high body mass index (BMI), the degree of liver steatosis is a more critical factor. Research by the Toronto group found that using grafts with macrosteatosis <10% from donors with BMI >30 kg/m2 had no negative impact on LDLT outcomes [40]. This suggests that high BMI alone does not increase SFSS risk, making the assessment of liver macrosteatosis essential. Most centers now rely on non-invasive radiological studies, such as the computed tomography (CT) liver attenuation index or magnetic resonance imaging (MRI)-based proton density fat fraction, to assess steatosis [41-43]. Percutaneous biopsy, once standard, is now less common due to its invasiveness and variability in results. Generally, a macrosteatosis cutoff of 10% is used for right lobe (RL) liver donation, as higher levels are associated with increased risk of EAD and SFSS, due to factors like inflammatory cytokine release and poor tolerance to ischemia-reperfusion injury [34,42,44-46]. While grafts with up to 20% macrosteatosis are not absolutely contraindicated, they are not recommended, especially when combined with a small-for-size graft (SFSG), due to the significantly increased risk of SFSS. Weight loss interventions can help reduce macrosteatosis, thereby lowering risks for both donors and recipients [47-49]. A randomized controlled trial (RCT) further supports this approach, showing improved donor liver regeneration and decreased EAD in recipients following pre-surgery interventions like a low-calorie diet and exercise regimen [50].
GRAFT TYPE CONSIDERATIONS
The choice between right lobe grafts (RLG) and left lobe grafts (LLG) in LDLT involves balancing donor morbidity risks with recipient outcomes. Although LLG are often suggested for enhancing donor safety, particularly after poor outcomes with RLG at some centers, major studies have shown no significant difference in donor morbidity between RL and LL grafts [51-54]. This has prevented a “left-shift” in preference, as RLG generally result in fewer technical challenges, fewer vascular complications, better regeneration, and improved graft and patient survival in both the short and long term. A meta-analysis of 25,230 donors even indicated that RL LDLT recipients are less likely to develop SFSS [55].
RECIPIENT RISK FACTORS
Several factors can influence the development of SFSS in LDLT recipients, but the relationship between these factors and SFSS is complex and not always clear-cut. While some early studies suggested that older recipient age might increase SFSS risk [37,56], more comprehensive reviews and analyses have not consistently supported this link [34,35,57]. Therefore, recipient age alone should not be considered a determining factor for SFSS. Similarly, ABO-i has not been shown to significantly affect SFSS risk. A meta-analysis of 12 comparative studies found no increased risk of SFSS associated with ABO-i [58], although it may contribute to risk when combined with other factors such as donor age or recipient acuity.
Although univariate analyses in some studies have shown higher SFSS incidence in patients with cholestatic liver disease [38] and hepatocellular carcinoma recipients with BMI ≤30 kg/m2 [45], these associations did not hold up in multivariate analyses. Thus, there appears to be no direct correlation between the underlying liver disease and the risk of SFSS.
However, it is generally accepted that higher model for end‐stage liver disease (MELD) scores may predispose patients to SFSS if the graft is too small to meet metabolic demands. Several studies have reported an association between preoperative MELD scores and SFSS risk [34,56,59-61]. Specifically, MELD scores above 19 [35,46,56] or 26 [62] have been identified as significant predictors of SFSS. Consequently, many centers opt for larger grafts in patients with high MELD scores to mitigate this risk. For example, Alim et al. [63] suggested that grafts with a GRWR >0.8 are suitable for recipients with MELD scores >20, but this threshold can be lowered to 0.6 in lower MELD score patients, younger donors (<45 years), and absence of liver macrosteatosis.
Portal hypertension is also suspected to contribute to SFSS by leading to portal hyperperfusion, which damages SFSG [34,35,45,46]. However, the evidence is not conclusive. MELD scores and portal hypertension, therefore, increase SFSS risk primarily when combined with an undersized graft.
In acute liver failure (ALF) patients, larger grafts are typically preferred to meet metabolic demands, which generally ensures better outcomes post-LDLT. Although MELD scores may be high in ALF patients, PVP is often only mildly elevated, resulting in a lower-than-expected SFSS risk. Indeed, large series from Campsen et al. [64] and Pamecha et al. [65] reported no increased incidence of SFSS in ALF patients undergoing LDLT. In contrast, acute-on-chronic liver failure (ACLF) patients with background cirrhosis often present with high PVP, making them more susceptible to SFSS, particularly in cases involving SFSG [45]. Additionally, sepsis and systemic inflammatory response syndrome in ACLF patients further elevate SFSS risk due to increased metabolic demands [61,66,67].
Lastly, optimizing venous outflow in LDLT grafts is critical to preventing congestion and maximizing graft function [68,69]. There is currently no evidence to suggest that multiple arteries, portal veins (PV), or bile ducts (BD) are specifically associated with increased SFSS risk.
ADDITIONAL CONSIDERATIONS FOR UNRESOLVED ISSUES
Sarcopenia significantly increases the risk of morbidity and mortality both before and after LT, particularly in LDLT recipients. Sometimes, low MELD scores in sarcopenic patients may mask the true severity of their condition, thereby elevating the risk of mortality and sepsis post-transplant [70,71]. Pravisani et al. [72] demonstrated that pre-LDLT sarcopenia is associated with a decreased graft regeneration rate, with this negative impact being more pronounced in male patients. Although sarcopenia is linked to poorer outcomes in LT, including LDLT, objectively quantifying the degree of sarcopenia, frailty, and malnutrition remains challenging. It is evident that sarcopenic patients have higher metabolic demands and may require larger graft sizes, but there is a lack of literature to guide optimal graft size selection in this population [73-76].
Additionally, managing large amounts of abdominal ascites in the pre-LT setting poses challenges in estimating the appropriate graft size for a safe LDLT. Some authors recommend calculating the GRWR using the patient’s dry weight closest to the time of surgery, but this approach can be unreliable due to the difficulty in accurately measuring ascites volume. Consequently, estimating the appropriate graft size may lack precision. Moreover, other factors such as MELD score and renal function, particularly in hepatorenal syndrome, can directly influence ascites production, further complicating graft size estimation [77].
STRATEGIES IN MANAGING SMALL FOR SIZE SYNDROME IN LDLT
To achieve optimal outcomes in LDLT, several fundamental considerations are essential. These include ensuring optimal venous outflow, performing precise PV and HA anastomoses, and ensuring safe BD reconstruction. As such, careful pre-operative selection of a graft that is both the right size and anatomically suitable is critical for minimizing the risk of SFSS. Beyond these core principles, strategies for managing SFSS can be implemented at various stages of the treatment process—pre-operatively, intra-operatively, and post-operatively (Fig. 3).

Causes of SFSS and strategies for managing them based on pre-operative, intra-operative and post-operative factors. SFSS, small-for-size syndrome; LDLT, living donor liver transplantation; MELD, model for end-stage liver disease; GRWR, graft-to-recipient weight ratio.
Pre-operative strategies
As discussed, sarcopenia is associated with increased morbidity and mortality before and after LT, particularly in LDLT recipients. Pre-LT interventions, such as optimizing nutrition, controlling sepsis risk, and implementing prehabilitation programs, are essential. The North American Expert Opinion Statement on Sarcopenia in LT recommends using sarcopenia to predict LT prognosis [78]. A meta-analysis by Jiang et al. [75] found that preoperative sarcopenia correlates with longer intensive-care-unit stays, higher rates of sepsis, and increased post-LT complications. Moreover, in sarcopenic patients, BW-derived standard liver volume (SLV) formulas are more prone to underestimating SLV, thereby increasing the risk of post-LT SFSS [79].
Prehabilitation has shown promise in improving physical fitness and outcomes for LT candidates, particularly sarcopenic patients or those receiving smaller liver grafts, as it helps optimize graft regeneration and overall transplant outcomes. A systematic review by Jetten et al. [80] included eight studies with 1,094 patients, demonstrating significant improvements in VO2 peak, 6-minute walking distance, hand grip strength, liver frailty index, and quality of life [80]. The ILTS ERAS4OLT.org working group supports prehabilitation, citing potential short-term functional benefits [81]. However, data on postoperative outcomes remain limited, with ongoing RCTs aiming to address this gap [82-84].
Central to LDLT donor evaluation is liver volumetry, which determines whether the graft size is suitable for donation and sufficient for the recipient’s post-LT needs. The principle of “double-equipoise” must guide donor selection, ensuring that the donor’s risks are justified by the recipient’s benefits. Imaging studies, including CT and MRI, are used for liver volumetry, surgical planning, and assessing crucial structures. Advances in 3D reconstruction and artificial intelligence (AI)-enabled software now provide highly accurate liver volume measurements and vessel segmentation [85].
The GRWR remains the most commonly used parameter for estimating required graft size. Most centers recommend a GRWR ≥0.8% and/or graft volume to standard liver volume (GV/SLV) ≥40% as the safety cutoff, with SFSS rates under 10% when these thresholds are met [11]. Some centers accept a GRWR as low as 0.6% with PIM, particularly in cases with low MELD scores or non-cirrhotic hepatocellular carcinoma without significant portal hypertension. However, adherence to GRWR ≥0.8% is advised, especially for small-volume centers or new LT programs, to avoid an increased risk of SFSS and poor outcomes [59,62,86-90].
SLV calculation formulas typically use anthropometric measurements, such as BW, height, and body surface area (Table 4) [91-106], though Kokudo et al. [105] introduced a method using thoracic width via CT scan. When calculating GRWR, actual BW should be used to better estimate the recipient’s metabolic demand, and estimated ascites volume should be included for accuracy [77].
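To make the arithmetic concrete, the GRWR and a BW-based SLV estimate can be sketched as below. The Urata SLV formula and the DuBois body-surface-area formula are used here only as representative examples of the anthropometric approaches listed in Table 4, and the patient numbers are purely illustrative, not clinical guidance.

```python
def bsa_dubois(height_cm: float, weight_kg: float) -> float:
    """Body surface area (m^2) by the DuBois-DuBois formula."""
    return 0.007184 * height_cm**0.725 * weight_kg**0.425

def slv_urata(bsa_m2: float) -> float:
    """Standard liver volume (mL) by the Urata formula."""
    return 706.2 * bsa_m2 + 2.4

def grwr_percent(graft_weight_g: float, recipient_bw_kg: float) -> float:
    """Graft-to-recipient weight ratio, expressed as a percentage."""
    return graft_weight_g / (recipient_bw_kg * 1000.0) * 100.0

# Illustrative numbers only: 600 g graft, 70 kg / 170 cm recipient
graft_g, bw_kg, height_cm = 600.0, 70.0, 170.0
slv = slv_urata(bsa_dubois(height_cm, bw_kg))
grwr = grwr_percent(graft_g, bw_kg)
gv_slv = graft_g / slv * 100.0  # assumes 1 g of graft ~ 1 mL of volume
print(f"GRWR = {grwr:.2f}%  (>=0.8% commonly advised)")
print(f"GV/SLV = {gv_slv:.1f}%  (>=40% commonly advised)")
```

For this example the graft clears both commonly cited cutoffs (GRWR ≈ 0.86%, GV/SLV ≈ 47%); as noted above, actual BW, including an estimate of ascites volume, should be used for the recipient side of the ratio.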
Intra-operative strategies
Venous outflow reconstruction
Once a suitable graft with the right size and anatomy has been identified for LDLT, the focus shifts to ensuring optimal surgical techniques to achieve proper inflow and outflow of the graft. In LDLT, hepatic venous outflow obstruction is a critical factor in the development of SFSS, making venous outflow reconstruction one of the most challenging aspects of the procedure. The decision to include or exclude the middle hepatic vein (MHV), especially in RLG, and whether to perform venous outflow reconstruction is crucial to avoid postoperative complications related to graft congestion and SFSS.
Currently, there are no standardized guidelines on managing the MHV in LDLT, so decisions are typically made on a case-by-case basis following a thorough pre-operative assessment of the hepatic venous anatomy of both donors and recipients, with particular attention to the drainage of the anterior sector. Reports indicate that LDLT can be safely performed with or without the MHV [107-109]. Generally, if the FLR is ≥30%, the GRWR is ≥0.8, and segment IV drainage is minor, including the MHV in the graft can be safely performed. Conversely, if segment IV contributes significantly to the MHV, an RLG without the MHV should be considered. If the MHV is excluded, venous drainage reconstruction is essential for segment V and VIII tributary veins larger than 4–5 mm to prevent graft congestion [107,108]. Tan et al. [107] suggest that reconstruction of segments V and VIII may not be necessary in grafts without the MHV when the GRWR is ≥1.0, particularly in those with a dominant right hepatic vein.
Various methods are available for venous reconstruction in LDLT grafts. For right liver grafts, approaches can be divided into those with or without the MHV. When including the MHV, the donor hepatectomy typically follows a transection plane along the left side of the MHV, transitioning to the right once adequate exposure is achieved [107]. For grafts without the MHV, a more straightforward approach follows Cantlie’s line, with large segment V and VIII veins carefully preserved and clipped for later reconstruction on the backtable [107]. Venous reconstruction can be performed using several types of interposition grafts, including allogenic vascular grafts, cryopreserved cadaveric iliac grafts, or synthetic polytetrafluoroethylene grafts [107,109,110]. Other options include explanted PV and inferior mesenteric vein grafts [110], as well as the use of peritoneal patches for MHV reconstruction, which has shown reasonable outcomes in LDLT [111,112]. Ultimately, maximizing the potential of all venous outflows is crucial for the success of LDLT.
Graft differences: left vs. right lobes
While most adult-to-adult LDLT procedures use RLG due to their larger graft volume, there is concern that RLGs may increase donor morbidity. This has led some centers, especially those with poor outcomes from RLGs, to favor LLG to reduce donor risk. However, the use of smaller LLGs is associated with a higher risk of SFSS in recipients. To mitigate this risk, PIM techniques and other surgical measures have made LLGs more feasible.
Halazun et al. [113] reported one of the largest North American series of adult LDLT using LLGs, finding comparable recipient outcomes between LLG and RLG. Among 216 adult LDLT recipients, SFSS occurred in 5.4% of LLG recipients (n=3) but in none of the RLG recipients (P=0.003). Despite this, graft and patient survival rates were similar between the groups. Importantly, selective portal flow modulation was performed in cases with GRWR <0.8 or severe portal hypertension, regardless of graft weight [113]. Fujiki et al. [114] also found that portal flow modulation is crucial for improving outcomes in LLG LDLT, particularly in cases with small graft size, in their multicenter analysis of 130 adult LDLT recipients. Overall, current evidence suggests that both RLG and LLG can be safely used without increasing SFSS risk, provided that surgical portal modulation techniques are employed when necessary.
Dual graft LDLT
Given that up to one-third of donor candidates may be unsuitable due to factors such as hepatic steatosis, small FLR, or low estimated GRWR, dual graft LDLT—using both RLG and LLG—was developed to meet the recipient’s GRWR requirements [115]. Lee et al. [116] performed the first dual LLG LDLT in 2001 to address SFSS and ensure donor safety. However, dual graft LDLT is technically complex and demanding. High-volume LDLT centers have reported successful outcomes with this approach [117-119], but it should be undertaken only at such centers due to the high level of technical expertise required for success.
Flow measurements
As noted earlier, portal flow measurements are now strongly recommended during LDLT to optimize graft function and reduce the risk of SFSS. Understanding portal hemodynamics, particularly in managing portal hyperperfusion, is crucial for improving graft outcomes. The ILTS practice guidelines published in 2017 emphasize the importance of intraoperative hemodynamic monitoring, including the measurement of portal pressure, as well as arterial and venous portal flow [120]. If portal pressure exceeds 20 mmHg, PIM is advised to mitigate the risk of adverse outcomes [120]. Additionally, a PVF of 250 mL/min/100 g of graft is currently recommended as a surgical management cutoff [57].
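Because the PVF cutoff is expressed per 100 g of graft, a raw flowmeter reading must first be normalized to graft weight. A minimal sketch, with a hypothetical helper name and illustrative numbers:

```python
def pvf_per_100g(flow_ml_min: float, graft_weight_g: float) -> float:
    """Normalize a measured portal vein flow to mL/min per 100 g of graft."""
    return flow_ml_min / graft_weight_g * 100.0

# Illustrative only: 1,800 mL/min measured through a 600 g graft
pvf = pvf_per_100g(1800.0, 600.0)
print(f"PVF = {pvf:.0f} mL/min/100 g")  # 300, above the 250 mL/min/100 g cutoff
print("consider PIM" if pvf > 250 else "within cutoff")
```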
During the recipient’s surgery, baseline portal pressure measurements should be taken during the dissection phase. After completing vascular anastomoses and graft reperfusion, comprehensive hemodynamic monitoring, including portal and arterial flow and portal pressure, is essential. Special attention is required for cases with a GRWR <0.80%, as PIM must be tailored based on the severity of PV pressure. Soin et al. [89] detailed an institutional protocol where PIM is not necessary if portal pressure is ≤15 mmHg during dissection or if portal pressure exceeds 15 mmHg but the GRWR is ≥0.80%. For those with portal pressure >15 mmHg and a low GRWR, specific PIM strategies are recommended: a) GRWR <0.70%: hemiportocaval shunts (HPCS); b) GRWR 0.70–0.74%: HPCS for portal pressure >18 mmHg or SAL if between 15–18 mmHg; c) GRWR 0.75–0.79%: SAL [89]. This approach, which combines two accessible measurements, has proven to be a feasible and reliable guide for surgeons, extending beyond traditional practices.
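The graded protocol reported by Soin et al. [89] can be summarized as a simple decision rule. The sketch below merely transcribes the thresholds described above; it is illustrative only and not a clinical decision tool.

```python
def pim_strategy(portal_pressure_mmhg: float, grwr_pct: float) -> str:
    """Decision sketch of the graded PIM protocol of Soin et al. [89].
    Thresholds transcribed from the text; illustrative only."""
    # No PIM if pressure is <=15 mmHg, or the graft is not small (GRWR >=0.80%)
    if portal_pressure_mmhg <= 15 or grwr_pct >= 0.80:
        return "no PIM"
    if grwr_pct < 0.70:
        return "HPCS"                       # hemiportocaval shunt
    if grwr_pct < 0.75:                     # GRWR 0.70-0.74%
        return "HPCS" if portal_pressure_mmhg > 18 else "SAL"
    return "SAL"                            # GRWR 0.75-0.79%: splenic artery ligation

print(pim_strategy(14, 0.65))  # no PIM (pressure controlled)
print(pim_strategy(19, 0.72))  # HPCS
print(pim_strategy(16, 0.77))  # SAL
```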
Another key hemodynamic parameter is the liver graft-to-spleen volume ratio (GSVR), which has been proposed as a predictive tool for portal hypertension and SFSS [121,122]. A low GSVR has been linked to higher PVF post-transplant and poorer postoperative outcomes, including increased ascites, prolonged drainage, post-transplant thrombocytopenia, and overall worse prognosis compared to a normal GSVR [122,123]. However, there is inconsistency in defining the GSVR cutoff value, with suggested thresholds ranging from <0.6 to <1.03 g/mL [121-123]. Further research is needed to establish a standardized GSVR cutoff for clinical use.
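As a simple arithmetic illustration, the GSVR is the graft weight divided by the spleen volume. Because the cutoff is not standardized, the threshold is left as a parameter in this hypothetical sketch rather than fixed at a single value.

```python
def gsvr(graft_weight_g: float, spleen_volume_ml: float) -> float:
    """Graft-to-spleen volume ratio, in g/mL."""
    return graft_weight_g / spleen_volume_ml

def low_gsvr(graft_weight_g: float, spleen_volume_ml: float,
             cutoff_g_per_ml: float = 0.6) -> bool:
    """Flag a low GSVR. No standardized cutoff exists; published
    thresholds range from <0.6 to <1.03 g/mL, so the cutoff is a
    parameter here (defaulting to the most conservative value)."""
    return gsvr(graft_weight_g, spleen_volume_ml) < cutoff_g_per_ml
```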
PIM
To mitigate the effects of high portal pressure on smaller grafts in LDLT, a range of interventions collectively known as PIM have been developed, employing pharmacological, radiological, and surgical strategies [124-127]. Intraoperatively, when portal flow measurements are elevated, pharmacological and surgical PIM can be used as preventive measures to reduce the risk of post-transplant SFSS. Radiological PIM, on the other hand, is typically employed as a salvage measure in the post-transplant setting, when SFSS has already developed [128,129]. Given that surgical PIM is a relatively new approach, there is limited evidence on its efficacy in preventing SFSS and allograft dysfunction [125]. Preferred PIM techniques vary among centers, with SAL/embolization, splenectomy, and portosystemic shunting (e.g., HPCS) being the main surgical strategies recommended to reduce the risk of or treat SFSS [120]. Table 5 summarizes the key surgical PIM strategies used to decrease the incidence of SFSS.
SAL
SAL has been shown to reduce PVP while simultaneously increasing HA flow [18,128,130-132]. This reduction in pressure is achieved by decreasing splenic outflow, and SAL is also used to counteract hepatic arterial hypoperfusion (<100 mL/min) by reducing the “splenic steal” effect. Proximal SAL is simple, easy to perform, and associated with low morbidity, making it the recommended first-line surgical PIM according to the 2023 ILTS consensus guidelines [5]. However, the reduction in PVP with SAL alone may be modest and temporary, leading some centers to combine it with splenectomy for a greater and more sustained decrease in PVP [133,134].
Splenectomy
Splenectomy is increasingly used by some LDLT centers to reduce PVP and PVF [32,36,37,39,135-144]. This procedure results in a significant reduction in PVF, as the spleen can account for up to 52% of total portal blood flow. However, splenectomy carries risks such as bleeding, splenic vein and PV thrombosis, septic complications, and pancreatic leak, which have limited its widespread adoption as a PIM modality. Despite these risks, some high-volume LDLT centers continue to use splenectomy as their primary surgical PIM technique to effectively reduce portal flow.
Splenic devascularization (SDV)
SDV is an alternative approach for PIM that preserves splenic function by reducing the splenic contribution to portal venous inflow while maintaining the spleen’s anatomical and immunological integrity [141]. By interrupting the arterial supply or venous outflow, SDV alleviates sinusoidal congestion and balances portal inflow with the graft’s drainage capacity, protecting the liver graft. Unlike splenectomy, SDV avoids the risks of total splenic removal, such as impaired splenic immune function and overwhelming infection, but it may not reduce portal inflow as robustly, particularly in cases of severe portal hypertension, potentially leaving residual portal hyperperfusion. Risks include ischemia, splenic infarction, and splenic vein or PV thrombosis, which can adversely affect graft recovery, making SDV more suitable for milder cases or specific populations, such as pediatric recipients or those at high risk of infections. SDV can be employed in conjunction with adjunctive measures like portocaval shunts or pharmacological agents for optimal outcomes in SFSG.
HPCS
HPCS diverts a substantial portion of portal flow away from the partial liver graft, rapidly decreasing PVP and PVF [20,89,132]. However, this significant reduction in portal pressure can lead to graft hypoperfusion and “steal” phenomena, potentially causing graft dysfunction. The technical aspects of HPCS, such as the choice of conduit, size, timing, and whether to prophylactically close the shunt after graft regeneration, have yet to be standardized. Botha et al. [145] described the creation of HPCS by directly connecting the left or right branch of the PV to the inferior vena cava or using a conduit (e.g., recipient PV, cryopreserved veins, or synthetic grafts). Unlike autologous grafts, synthetic grafts with fixed diameters can ensure that the diversion remains within predefined limits. Unresolved issues with HPCS include determining the optimal timing for its creation and deciding whether the shunt should be closed in the long term to prevent ongoing portal steal from the graft and complications such as recurrent hepatic encephalopathy and liver atrophy.
Post-operative strategies
In the post-LDLT setting, it is crucial to implement measures that minimize the risk of developing SFSS, particularly in recipients with small liver grafts. However, even with seemingly adequate graft sizes, SFSS can still occur. Identifying potential SFSS requires a high index of suspicion. The newly proposed grading system and recommended surveillance regimen for post-LDLT patients provide valuable guidance [11]. In Grade A SFSS, medical therapy, including pharmacological agents, should be initiated. If the condition progresses to Grade B, characterized by portal hypertension, surgical or radiologic intervention should be considered. In Grade C, the liver failure phase of SFSS, in addition to surgical or radiologic interventions, liver re-transplantation must be actively explored [145].
For high-risk patients, initiating pharmacological agents for PIM should be considered. Selected pharmacological agents have been employed both intraoperatively and postoperatively for portal flow modulation in LDLT, including SST, terlipressin, beta-blockers, and others. SST and its synthetic analogue, octreotide, traditionally used for variceal bleeding in cirrhosis, are also effective in reducing PVP by inducing sinusoidal dilation and mitigating parenchymal injury from microcirculatory shear force [144]. Rhaiem et al. [146] demonstrated that intraoperative initiation of SST therapy, followed by a continuous infusion of 250 µg/h over five days, effectively modulated portal inflow and significantly reduced PHLF. Similarly, other studies have demonstrated that intraoperative and postoperative administration of SST reduces PVP in LT, thereby mitigating SFSS. Troisi et al. [147] reported a significant reduction in PVF and hepatic venous pressure gradient in LT recipients treated with an intraoperative SST bolus followed by a five-day infusion, with no differences in adverse events or long-term complications compared to controls. Jo et al. [148] reported positive outcomes with a continuous SST infusion for a median of seven days post-LDLT, with no long-term liver dysfunction or SFSS-related mortality. Additionally, SST has been effectively combined with beta-blockers in post-LDLT settings [149].
Beta-blockers can also effectively reduce portal hypertension by decreasing cardiac output via β1 blockade and reducing PVF via β2 blockade. Nonselective beta-blockers are particularly effective for variceal bleeding prophylaxis [150]. Busani et al. [151] combined octreotide with the beta-blocker esmolol (a selective β1-adrenergic antagonist) in LDLT recipients, demonstrating a decrease in PVF compared to baseline measurements, further supporting the role of combined pharmacological strategies.
Terlipressin, a vasopressin analogue, acts by inducing splanchnic vasoconstriction, thereby reducing portal flow [152]. Mukhtar et al. [153] demonstrated that terlipressin effectively decreased PVP, increased mean arterial pressure, improved renal function, and reduced ascites within four days postoperatively. Similarly, Reddy et al. [127] observed reduced postoperative renal injury in the intervention group, although they did not find a significant difference in intraoperative PVP.
When administered intraoperatively and continued postoperatively for 48–72 hours, terlipressin can effectively lower portal pressures within an hour of administration, with sustained effects throughout the infusion period [153-155]. However, its use requires caution due to the risk of severe vasoconstriction, which may lead to dose-dependent adverse effects, including angina, end-organ ischemia, and metabolic dysregulation. The data on terlipressin’s impact on portal hemodynamics remains inconsistent. For example, an RCT by Reddy et al. [127] reported no significant reduction in post-reperfusion portal pressures but noted decreased portal flow velocities. Likewise, Karaaslan and Sevmis [156], in a retrospective analysis, found no significant benefits of terlipressin on portal hemodynamics. Despite these conflicting findings, most studies highlight terlipressin’s beneficial effects on renal function in liver transplant recipients. These benefits are evidenced by reductions in serum urea and creatinine levels and increased urine output [127,153-155,157].
Prostaglandin E1 (PGE1) has also been studied for its role in PIM. Suehiro et al. [158] showed that a seven-day intraportal administration of PGE1 following LDLT was associated with a lower incidence of SFSS (3.4% vs. 25.4%), as well as reduced ascites output and serum bilirubin levels two weeks postoperatively. PGE1 and prostacyclin prevent congestion in smaller grafts through their vasodilatory effects on hepatic circulation [152,159-161]. Onoe et al. [159] reported better recovery from hyperammonemia and hyperbilirubinemia, along with improved survival in recipients with SFSG who received intraportal PGE1 infusion for one week. Bärthel et al. [162] conducted a pilot study using a PGI2 analogue, iloprost, for seven days following DDLT, noting a trend toward a lower rate of primary graft dysfunction in the iloprost group.
While studies on pharmacological PIM for managing portal hyperperfusion are diverse, one clear advantage of pharmacological agents is their ability to be rapidly reversed and incrementally adjusted. Compared to surgical or radiologic intervention, pharmacological approaches are easier to manage, although the evidence base is still in its early stages.
In addition to pharmacological agents, more aggressive interventions to modulate portal inflow—such as splenic artery embolization (SAE) or surgical options like splenectomy—can be effective in reducing portal hyperperfusion and improving hepatic arterial flow in LDLT recipients. These interventions work on the same principle as SAL, aiming to decrease PVP and prevent SFSS. Currently, seven studies have been published on this topic, with five retrospective series showing highly promising outcomes, particularly in reducing PV velocities. In most cases (37 out of 38), patients experienced resolution of ascites/hydrothorax and hyperbilirubinemia, although responses in INR were variable across studies [163-169]. For patients who did not respond to SAE or SAL, additional interventions, including transjugular intrahepatic portosystemic shunt, completion SAE following failed SAL, and liver re-transplantation, were performed [165-167]. Common post-embolization syndromes following SAE/SAL include pain, fever, pancreatitis, splenic infarction, and abscess formation [163,165,166]. Nontransplant literature suggests that proximal SAE is preferable due to its lower complication rate compared to distal SAE [170].
In suspected cases of SFSS, it is critical to rule out other mechanical causes, such as PV stenosis and hepatic vein outflow obstruction [165,166]. Imaging studies, coupled with measurements of PVP and hepatic venous pressure gradient, can help determine if these factors are contributing to graft dysfunction. If the portal system’s pressure gradient exceeds 5 mm Hg or the hepatic venous/inferior vena cava gradient exceeds 3 mm Hg, radiologic or surgical intervention may be warranted to resolve the obstruction.
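The two gradient thresholds above amount to a simple screening rule. The sketch below is an illustrative encoding of those thresholds only; the function name is hypothetical, and a positive result would prompt imaging-guided confirmation, not intervention by itself.

```python
def mechanical_obstruction_suspected(pv_gradient_mmhg: float,
                                     hv_ivc_gradient_mmhg: float) -> bool:
    """Flag possible PV stenosis or hepatic venous outflow obstruction.

    pv_gradient_mmhg: pressure gradient across the portal system (mmHg)
    hv_ivc_gradient_mmhg: hepatic vein-to-inferior vena cava gradient (mmHg)

    Returns True when either gradient exceeds its threshold (>5 mmHg
    portal, >3 mmHg hepatic venous/IVC), suggesting a mechanical cause
    of graft dysfunction that may warrant radiologic or surgical
    intervention rather than a diagnosis of SFSS.
    """
    return pv_gradient_mmhg > 5 or hv_ivc_gradient_mmhg > 3
```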
Supportive care is essential in managing SFSS to optimize graft regeneration and recovery. Key aspects of medical management include infection prevention, immunosuppressant optimization, and maintaining proper nutrition and fluid balance. Notably, LDLT recipients may require lower doses of immunosuppressants due to the altered metabolism of smaller grafts, which can impact drug levels and the liver’s regenerative process [171]. Subtherapeutic immunosuppression can lead to acute cellular rejection, inflammation, increased graft stiffness, and poor regeneration, which are particularly detrimental to small grafts. Liu et al. [172] found that tacrolimus dosages were lower for two months in LDLT recipients with GRWR <0.8% compared to those with GRWR ≥0.8%, while maintaining similar drug levels. Another study indicated that donor age and smaller GV/SLV were associated with a prolonged half-life of tacrolimus following LDLT [173]. Therefore, maintaining adequate immunosuppression in SFSS patients is critical, with consideration given to the altered metabolic capacity of the regenerating graft.
Infection prevention remains vital, although there are no specific data on antibiotic prophylaxis in SFSS. Standard postoperative antibiotic recommendations for liver recipients should be followed, and rifaximin may be added in cases of hyperammonemia [174,175]. Managing ascites is another key aspect of SFSS care, although no standard recommendations exist. The primary strategies involve balancing sodium intake and replacing plasma volume according to the amount of ascitic fluid drained. Patients with high-output ascitic drainage face significant challenges in maintaining volume and sodium homeostasis, as the sodium content in the drainage fluid is similar to plasma, leading to rapid intravascular hypovolemia, renal dysfunction, and hyponatremia. Replacing fluid loss with 5% albumin can prevent hypovolemia and renal dysfunction [176]. In this context, sodium and water restriction and diuretics are not appropriate, as ongoing sodium loss necessitates careful fluid management akin to continuous paracentesis.
CONCLUSION
SFSS is a critical consideration in LDLT. The consensus established by the ILTS-iLDLT-LTSI has standardized the definition of SFSS and introduced a new grading system to stratify its severity, providing the liver transplant community with a unified language to document and predict SFSS. This has been complemented by a strategic, evidence-based approach for managing SFSS, incorporating both surgical and pharmacological PIM techniques. Key to preventing SFSS is ensuring an adequate liver graft size and employing optimal surgical techniques during graft anastomosis. Additionally, comprehensive perioperative management, including infection control, immunosuppressant optimization, and maintaining fluid, electrolyte, and nutritional balance, is essential for successful outcomes. In areas where evidence is still emerging, it is vital that multicenter studies be conducted to deepen our understanding and improve the management of SFSS in LDLT. Continued research and collaboration will be crucial in refining strategies and advancing care in this important area of LT.
Notes
Authors’ contribution
J.H. Law and A.W.C. Kow wrote the manuscript.
Conflicts of Interest
The authors have no conflicts to disclose.