arXiv daily: General Economics (econ.GN)

1.Big Tech's Tightening Grip on Internet Speech

Authors:Gregory M. Dickinson

Abstract: Online platforms have completely transformed American social life. They have democratized publication, overthrown old gatekeepers, and given ordinary Americans a fresh voice in politics. But the system is beginning to falter. Control over online speech lies in the hands of a select few -- Facebook, Google, and Twitter -- who moderate content for the entire nation. It is an impossible task. Americans cannot even agree among themselves what speech should be permitted. And, more importantly, platforms have their own interests at stake: Fringe theories and ugly name-calling drive away users. Moderation is good for business. But platform beautification has consequences for society's unpopular members, whose unsightly voices are silenced in the process. With control over online speech so centralized, online outcasts are left with few avenues for expression. Concentrated private control over important resources is an old problem. Last century, for example, saw the rise of railroads and telephone networks. To ensure access, such entities are treated as common carriers and required to provide equal service to all comers. Perhaps the same should be true for social media. This Essay responds to recent calls from Congress, the Supreme Court, and academia arguing that, like common carriers, online platforms should be required to carry all lawful content. The Essay studies users' and platforms' competing expressive interests, analyzes problematic trends in platforms' censorship practices, and explores the costs of common-carrier regulation before ultimately proposing market expansion and segmentation as an alternate pathway to avoid the economic and social costs of common-carrier regulation.

2.Toward Textual Internet Immunity

Authors:Gregory M. Dickinson

Abstract: Internet immunity doctrine is broken. Under Section 230 of the Communications Decency Act of 1996, online entities are absolutely immune from lawsuits related to content authored by third parties. The law has been essential to the internet's development over the last twenty years, but it has not kept pace with the times and is now deeply flawed. Democrats demand accountability for online misinformation. Republicans decry politically motivated censorship. And Congress, President Biden, the Department of Justice, and the Federal Communications Commission all have their own plans for reform. Absent from the fray, however -- until now -- has been the Supreme Court, which has never issued a decision interpreting Section 230. That appears poised to change following Justice Thomas's statement in Malwarebytes v. Enigma, in which he urges the Court to prune back decades of lower-court precedent to craft a more limited immunity doctrine. This Essay discusses how courts' zealous enforcement of the early internet's free-information ethos gave birth to an expansive immunity doctrine, warns of potential pitfalls to reform, and explores what a narrower, text-focused doctrine might mean for the tech industry.

3.Rebooting Internet Immunity

Authors:Gregory M. Dickinson

Abstract: We do everything online. We shop, travel, invest, socialize, and even hold garage sales. Even though we may not care whether a company operates online or in the physical world, however, the question has dramatic consequences for the companies themselves. Online and offline entities are governed by different rules. Under Section 230 of the Communications Decency Act, online entities -- but not physical-world entities -- are immune from lawsuits related to content authored by their users or customers. As a result, online entities have been able to avoid claims for harms caused by their negligence and defective product designs simply because they operate online. The reason for the disparate treatment is the internet's dramatic evolution over the last two decades. The internet of 1996 served as an information repository and communications channel and was well governed by Section 230, which treats internet entities as another form of mass media: Because Facebook, Twitter and other online companies could not possibly review the mass of content that flows through their systems, Section 230 immunizes them from claims related to user content. But content distribution is not the internet's only function, and it is even less so now than it was in 1996. The internet also operates as a platform for the delivery of real-world goods and services and requires a correspondingly diverse immunity doctrine. This Article proposes refining online immunity by limiting it to claims that threaten to impose a content-moderation burden on internet defendants. Where a claim is preventable other than by content moderation -- for example, by redesigning an app or website -- a plaintiff could freely seek relief, just as in the physical world. This approach empowers courts to identify culpable actors in the virtual world and treat like conduct alike wherever it occurs.

1.Post-COVID Inflation & the Monetary Policy Dilemma: An Agent-Based Scenario Analysis

Authors:Max Sina Knicker, Karl Naumann-Woleske, Jean-Philippe Bouchaud, Francesco Zamponi

Abstract: The economic shocks that followed the COVID-19 pandemic have brought to light the difficulty, both for academics and policy makers, of describing and predicting the dynamics of inflation. This paper offers an alternative modelling approach. We study the 2020-2023 period within the well-studied Mark-0 Agent-Based Model, in which economic agents act and react according to plausible behavioural rules. We include in particular a mechanism through which trust of economic agents in the Central Bank can de-anchor. We investigate the influence of regulatory policies on inflationary dynamics resulting from three exogenous shocks, calibrated on those that followed the COVID-19 pandemic: a production/consumption shock due to COVID-related lockdowns, a supply-chain shock, and an energy price shock exacerbated by the Russian invasion of Ukraine. By exploring the impact of these shocks under different assumptions about monetary policy efficacy and transmission channels, we review various explanations for the resurgence of inflation in the United States, including demand-pull, cost-push, and profit-driven factors. Our main results are four-fold: (i) without appropriate policy, the shocked economy can take years to recover, or even tip over into a deep recession; (ii) the response to policy is non-monotonic, leading to a narrow window of "optimal" policy responses due to the trade-off between inflation and unemployment; (iii) the success of monetary policy in curbing inflation is primarily due to expectation anchoring, rather than to the direct impact of interest rate hikes; (iv) the two most sensitive model parameters are those describing wage and price indexation. The results of our study have implications for Central Bank decision-making, and our model offers an easy-to-use tool that may help anticipate the consequences of different monetary and fiscal policies.

2.The shape of business cycles: a cross-country analysis of Friedman's plucking theory

Authors:Emanuel Kohlscheen, Richhild Moessner, Daniel Rees

Abstract: We test the international applicability of Friedman's famous plucking theory of the business cycle in 12 advanced economies between 1970 and 2021. We find that in countries where labour markets are flexible (Australia, Canada, United Kingdom and United States), unemployment rates typically return to pre-recession levels, in line with Friedman's theory. Elsewhere, unemployment rates are less cyclical. Output recoveries differ less across countries, but more across episodes: on average, half of the decline in GDP during a recession persists. In terms of sectors, declines in manufacturing are typically fully reversed. In contrast, construction-driven recessions, which are often associated with bursting property price bubbles, tend to be persistent.

1.Examination of Supernets to Facilitate International Trade for Indian Exports to Brazil

Authors:Evan Winter, Anupam Shah, Ujjwal Gupta, Anshul Kumar, Deepayan Mohanty, Juan Carlos Uribe, Aishwary Gupta, Mini P. Thomas

Abstract: The objective of this paper is to investigate a more efficient cross-border payment and document handling process for the export of Indian goods to Brazil. The paper is structured into two sections: first, to explain the problems unique to the India-Brazil international trade corridor by highlighting the obstacles of compliance, speed, and payments; and second, to propose a digital solution for India-Brazil trade utilizing Supernets, focusing on the use case of Indian exports. The solution assumes that stakeholders will be onboarded as permissioned actors (i.e. nodes) on a Polygon Supernet. By engaging trade and banking stakeholders, we ensure that the digital solution results in export benefits for Indian exporters, and a lawful channel to receive hard currency payments. The involvement of Brazilian and Indian banks ensures that Letter of Credit (LC) processing time and document handling occur at the speed of blockchain technology. The ultimate goal is to achieve a faster settlement and negotiation period while maintaining a regulatory-compliant outcome, so that the end result is faster and easier, yet otherwise identical to the real-world process in terms of export benefits and compliance.

2.Life after (Soft) Default

Authors:Giacomo De Giorgi, Costanza Naguib

Abstract: We analyze the impact of soft credit default (i.e. a delinquency of 90+ days) on individual trajectories. Using a proprietary dataset on about 2 million individuals for the years 2004 to 2020, we find that a soft default has substantial and long-lasting (i.e. up to ten years after the event) negative effects on credit score, total credit limit, home-ownership status, and income.

1.Paradoxical Oddities in Two Multiwinner Elections from Scotland

Authors:Adam Graham-Squire, David McCune

Abstract: Ranked-choice voting anomalies such as monotonicity paradoxes have been extensively studied through creating hypothetical examples and generating elections under various models of voter behavior. However, very few real-world examples of such voting paradoxes have been found and analyzed. We investigate two single-transferable vote elections from Scotland that demonstrate upward monotonicity, downward monotonicity, no-show, and committee size paradoxes. These paradoxes are rarely observed in real-world elections, and this article is the first case study of such paradoxes in multiwinner elections.
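To make the upward monotonicity paradox concrete, here is a toy single-winner instant-runoff count (the single-seat special case of STV). The ballot profiles are hypothetical textbook-style examples, not data from the Scottish elections studied in the paper.

```python
from collections import Counter

def irv_winner(ballots):
    """Instant-runoff count. ballots: list of (count, ranking) pairs."""
    active = {c for _, ranking in ballots for c in ranking}
    while True:
        tally = Counter()
        for count, ranking in ballots:
            for cand in ranking:  # top remaining choice on each ballot
                if cand in active:
                    tally[cand] += count
                    break
        leader, votes = tally.most_common(1)[0]
        if 2 * votes > sum(tally.values()):
            return leader  # majority of continuing ballots
        active.remove(min(tally, key=tally.get))  # drop last-placed candidate

# A wins the original profile ...
base = [(39, ('A', 'B', 'C')), (35, ('B', 'C', 'A')), (26, ('C', 'A', 'B'))]
# ... but loses after 10 of the B>C>A voters RAISE A to the top:
lifted = [(49, ('A', 'B', 'C')), (25, ('B', 'C', 'A')), (26, ('C', 'A', 'B'))]
```

Running `irv_winner` on the two profiles returns 'A' and then 'C': the extra first-place support for A changes the elimination order (B is eliminated instead of C, and B's ballots transfer to C), so A loses precisely because A was ranked higher -- an upward monotonicity failure.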

1.Playing the system: address manipulation and access to schools

Authors:Andreas Bjerre-Nielsen, Lykke Sterll Christensen, Mikkel Høst Gandil, Hans Henrik Sievertsen

Abstract: Strategic incentives may lead to inefficient and unequal provision of public services. A prominent example is school admissions. Existing research shows that applicants "play the system" by submitting school rankings strategically. We investigate whether applicants also play the system by manipulating their eligibility at schools. We analyze this applicant deception in a theoretical model and provide testable predictions for commonly used admission procedures. We confirm these model predictions empirically by analyzing the implementation of two reforms. First, we find that the introduction of a residence-based school-admission criterion in Denmark caused address changes to increase by more than 100% before the high-school application deadline. This increase occurred only in areas where the incentive to manipulate is high-powered. Second, to assess whether this behavior reflects actual address changes, we study a second reform that required applicants to provide additional proof of place of residence to approve an address change. The second reform significantly reduced address changes around the school application deadline, suggesting that the observed increase in address changes mainly reflects manipulation. The manipulation is driven by applicants from more affluent households, and their behavior affects non-manipulating applicants. Counterfactual simulations show that among students not enrolling in their first listed school, more than 25% would have been offered a place in the absence of address manipulation, and their peer GPA is 0.2 SD lower due to the manipulative behavior of other applicants. Our findings show that popular school choice systems give applicants the incentive to play the system, with real implications for non-strategic applicants.

1.The Economics of Augmented and Virtual Reality

Authors:Joshua Gans, Abhishek Nagaraj

Abstract: This paper explores the economics of Augmented Reality (AR) and Virtual Reality (VR) technologies within decision-making contexts. Two metrics are proposed: Context Entropy, the informational complexity of an environment, and Context Immersivity, the value from full immersion. The analysis suggests that AR technologies assist in understanding complex contexts, while VR technologies provide access to distant, risky, or expensive environments. The paper provides a framework for assessing the value of AR and VR applications in various business sectors by evaluating the pre-existing context entropy and context immersivity. The goal is to identify areas where immersive technologies can significantly impact and distinguish those that may be overhyped.

1.Social Sustainability of Digital Transformation: Empirical Evidence from EU-27 Countries

Authors:Saeed Nosratabadi, Thabit Atobishi, Szilard Hegedus

Abstract: In the EU-27 countries, the importance of the social sustainability of digital transformation (SOSDIT) is heightened by the need to balance economic growth with social cohesion. By prioritizing SOSDIT, the EU can ensure that its citizens are not left behind in the digital transformation process and that technology serves the needs of all Europeans. Therefore, the current study aimed firstly to evaluate the SOSDIT of the EU-27 countries and then to model its importance in reaching the sustainable development goals (SDGs). Using structural equation modeling, the study provides quantitative empirical evidence that digital transformation is most socially sustainable in Finland, the Netherlands, and Denmark, respectively. It also finds that SOSDIT leads countries to higher performance in reaching the SDGs. Finally, the study provides evidence of an inverse relationship between the Gini coefficient and performance in reaching the SDGs: the higher a country's Gini coefficient, the lower its performance in reaching the SDGs. The findings of this study contribute to the literature on sustainability and digitalization. The study also provides empirical evidence on the SOSDIT levels of the EU-27 countries that can serve as a foundation for developing policies to improve the social sustainability of digital transformation. Based on these findings, the study offers practical recommendations for countries to ensure that their digital transformation is sustainable and has a positive impact on society.

2.Modeling the Impact of Mentoring on Women's Work-Life Balance: A Grounded Theory Approach

Authors:Parvaneh Bahrami, Saeed Nosratabadi, Khodayar Palouzian, Szilard Hegedus

Abstract: The purpose of this study was to model the impact of mentoring on women's work-life balance. Indeed, this study considered mentoring as a solution to create a work-life balance of women. For this purpose, semi-structured interviews with both mentors and mentees of Tehran Municipality were conducted and the collected data were analyzed using constructivist grounded theory. Findings provided a model of how mentoring affects women's work-life balance. According to this model, role management is the key criterion for work-life balancing among women. In this model, antecedents of role management and the contextual factors affecting role management, the constraints of mentoring in the organization, as well as the consequences of effective mentoring in the organization are described. The findings of this research contribute to the mentoring literature as well as to the role management literature and provide recommendations for organizations and for future research.

3.More than Words: Twitter Chatter and Financial Market Sentiment

Authors:Travis Adams, Andrea Ajello, Diego Silva, Francisco Vazquez-Grande

Abstract: We build a new measure of credit and financial market sentiment using Natural Language Processing on Twitter data. We find that the Twitter Financial Sentiment Index (TFSI) correlates highly with corporate bond spreads and other price- and survey-based measures of financial conditions. We document that overnight Twitter financial sentiment helps predict next day stock market returns. Most notably, we show that the index contains information that helps forecast changes in the U.S. monetary policy stance: a deterioration in Twitter financial sentiment the day ahead of an FOMC statement release predicts the size of restrictive monetary policy shocks. Finally, we document that sentiment worsens in response to an unexpected tightening of monetary policy.

4.Validating a dynamic input-output model for the propagation of supply and demand shocks during the COVID-19 pandemic in Belgium

Authors:Tijs W. Alleman, Koen Schoors, Jan M. Baetens

Abstract: This work validates a previously established dynamical input-output model to quantify the impact of economic shocks caused by COVID-19 in the UK using data from Belgium. To this end, we used four time series of economically relevant indicators for Belgium. We identified eight model parameters that could potentially impact the results and varied these parameters over broad ranges in a sensitivity analysis. In this way, we could identify the set of parameters that results in the best agreement with the empirical data, and we could assess the sensitivity of our outcomes to changes in these parameters. We find that the model, characterized by relaxing the stringent Leontief production function, provides adequate projections of economically relevant variables during the COVID-19 pandemic in Belgium, both at the aggregated and sectoral levels. The obtained results are robust to changes in the input parameters and hence the model could prove to be a valuable tool in predicting the impact of future shocks caused by armed conflicts, natural disasters, or pandemics.

1.The Emergence of Economic Rationality of GPT

Authors:Yiting Chen, Tracy Xiao Liu, You Shan, Songfa Zhong

Abstract: As large language models (LLMs) like GPT become increasingly prevalent, it is essential that we assess their capabilities beyond language processing. This paper examines the economic rationality of GPT by instructing it to make budgetary decisions in four domains: risk, time, social, and food preferences. We measure economic rationality by assessing the consistency of GPT decisions with utility maximization in classic revealed preference theory. We find that GPT decisions are largely rational in each domain and demonstrate higher rationality scores than those of humans reported in the literature. We also find that the rationality scores are robust to the degree of randomness and demographic settings such as age and gender, but are sensitive to contexts based on the language frames of the choice situations. These results suggest the potential of LLMs to make good decisions and the need to further understand their capabilities, limitations, and underlying mechanisms.
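The rationality benchmark invoked here, consistency of choices with utility maximization, is testable via the Generalized Axiom of Revealed Preference (GARP). Below is a minimal sketch of such a consistency check; the paper's actual scoring is presumably richer (e.g. efficiency indices rather than a binary pass/fail), and the price-bundle data here are made-up two-good examples.

```python
def garp_consistent(prices, bundles):
    """Check a sequence of (price, chosen-bundle) observations for GARP:
    if bundle x_t is (possibly indirectly) revealed preferred to x_s, then
    x_t must not be strictly cheaper than x_s at the prices of observation s."""
    n = len(bundles)
    dot = lambda p, x: sum(pi * xi for pi, xi in zip(p, x))
    # Direct weak revealed preference: x_t R0 x_s iff p_t.x_t >= p_t.x_s
    R = [[dot(prices[t], bundles[t]) >= dot(prices[t], bundles[s])
          for s in range(n)] for t in range(n)]
    # Warshall transitive closure gives the full revealed-preference relation
    for k in range(n):
        for i in range(n):
            for j in range(n):
                R[i][j] = R[i][j] or (R[i][k] and R[k][j])
    # Violation: x_t R x_s while x_t was strictly cheaper at prices p_s
    return not any(
        R[t][s] and dot(prices[s], bundles[s]) > dot(prices[s], bundles[t])
        for t in range(n) for s in range(n))
```

For example, choosing (1, 0) at prices (1, 1) and then (0, 1) at prices (1, 2) violates GARP (each choice is revealed preferred to the other), whereas choosing (2, 0) at prices (1, 2) and (0, 2) at prices (2, 1) is consistent, since neither bundle was affordable at the other observation's budget.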

2.Ownership Chains in Multinational Enterprises

Authors:Stefania Miricola, Armando Rungi, Gianluca Santoni

Abstract: In this contribution, we investigate the role of ownership chains developed by multinational enterprises across different national borders. First, we document that parent companies control a majority (58%) of foreign subsidiaries through indirect control relationships involving at least two countries along an ownership chain. We therefore hypothesize that locations along ownership chains are driven by the communication costs of transmitting management decisions. In line with this motivating evidence, we develop a theoretical model of competition for corporate control that allows parent companies in the origin countries to delegate the monitoring of final subsidiaries to middlemen subsidiaries located in intermediate jurisdictions. Our model yields a two-step empirical strategy with two gravity equations: i) a triangular gravity equation for the parent's establishment of a middleman, conditional on the locations of final investments; ii) a classical gravity equation for the location of final investments. First estimates confirm the prediction that ease of communication at the country level shapes the heterogeneous locations of subsidiaries along global ownership chains.
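The "classical gravity" form in step (ii) predicts bilateral flows that rise with origin and destination economic mass and fall with distance (or, here, communication frictions). A generic sketch follows; the functional form and parameters are standard-textbook illustration, not the authors' estimated specification.

```python
def gravity_flow(mass_o, mass_d, dist, G=1.0, a=1.0, b=1.0, theta=1.0):
    """Classical gravity prediction: F_od = G * M_o^a * M_d^b / D_od^theta.
    Taking logs gives ln F = ln G + a*ln M_o + b*ln M_d - theta*ln D,
    the log-linear form typically fitted by OLS or PPML in gravity studies."""
    return G * (mass_o ** a) * (mass_d ** b) / (dist ** theta)
```

With unit elasticities, doubling either country's mass doubles the predicted flow, while doubling distance halves it; in the paper's second gravity step, a communication-cost proxy would play the role of the distance term.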

3.The Key to Organizational and Construction Excellence: A Study of Total Quality Management

Authors:M. R. Ibrahim, D. U. Muhammad, B. Muhammad, J. O. Alaezi, J. Agidani

Abstract: This study examines the impact of Total Quality Management (TQM) practices on organizational outcomes. Results show a significant relationship between TQM practices (top executive commitment, education and teaching, process control, and continuous progress) and organizational outcomes, and indicate how these practices can be leveraged to enhance performance outcomes.

4.The Missing Link: Exploring the Relationship Between Transformational Leadership and Change in Team Members in Construction

Authors:M. R. Ibrahim

Abstract: This study aimed to investigate how transformational leadership affects team processes, mediated by change in team members. A self-administered questionnaire was distributed to construction project team members in Abuja and Kaduna. Statistical analysis revealed significant positive relationships between transformational leadership and team processes, between transformational leadership and change in team members, and between change in team members and team processes, with change in team members mediating the relationship between transformational leadership and team processes. Future studies should consider cultural differences.

1.Artificial intelligence moral agent as Adam Smith's impartial spectator

Authors:Nikodem Tomczak

Abstract: Adam Smith developed a version of moral philosophy where better decisions are made by interrogating an impartial spectator within us. We discuss the possibility of using an external, non-human-based substitute tool that would augment our internal mental processes and play the role of the impartial spectator. Such a tool would have more knowledge about the world, be more impartial, and provide a more encompassing perspective on moral assessment.

2.AI Regulation in the European Union: Examining Non-State Actor Preferences

Authors:Jonas Tallberg, Magnus Lundgren, Johannes Geith

Abstract: As the development and use of artificial intelligence (AI) continues to grow, policymakers are increasingly grappling with the question of how to regulate this technology. The most far-reaching international initiative is the European Union (EU) AI Act, which aims to establish the first comprehensive framework for regulating AI. In this article, we offer the first systematic analysis of non-state actor preferences toward international regulation of AI, focusing on the case of the EU AI Act. Theoretically, we develop an argument about the regulatory preferences of business actors and other non-state actors under varying conditions of AI sector competitiveness. Empirically, we test these expectations using data on non-state actor preferences from public consultations on European AI regulation. Our findings are threefold. First, all types of non-state actors express concerns about AI and support regulation in some form. Second, there are nonetheless significant differences across actor types, with business actors being less concerned about the downsides of AI and more in favor of lax regulation than other non-state actors. Third, these differences are more pronounced in countries with stronger commercial AI sectors than in countries with less developed AI sectors. Our findings shed new light on non-state actor preferences toward AI regulation and point to challenges for policymakers having to balance competing interests.

3.The Global Governance of Artificial Intelligence: Next Steps for Empirical and Normative Research

Authors:Jonas Tallberg, Eva Erman, Markus Furendal, Johannes Geith, Mark Klamberg, Magnus Lundgren

Abstract: Artificial intelligence (AI) represents a technological upheaval with the potential to change human society. Because of its transformative potential, AI is increasingly becoming subject to regulatory initiatives at the global level. Yet, so far, scholarship in political science and international relations has focused more on AI applications than on the emerging architecture of global AI regulation. The purpose of this article is to outline an agenda for research into the global governance of AI. The article distinguishes between two broad perspectives: an empirical approach, aimed at mapping and explaining global AI governance; and a normative approach, aimed at developing and applying standards for appropriate global AI governance. The two approaches offer questions, concepts, and theories that are helpful in gaining an understanding of the emerging global governance of AI. Conversely, exploring AI as a regulatory issue offers a critical opportunity to refine existing general approaches to the study of global governance.

1.Health Impacts of Public Pawnshops in Industrializing Tokyo

Authors:Tatsuki Inoue

Abstract: This study is the first to investigate whether financial institutions for low-income populations have contributed to the historical decline in mortality rates. Using ward-level panel data from prewar Tokyo City, we found that public pawn loans were associated with reductions in infant and fetal death rates, potentially through improved nutrition and hygiene measures. Simple calculations suggest that popularizing public pawnshops led to a 6% and 8% decrease in infant mortality and fetal death rates, respectively, from 1927 to 1935. Contrarily, private pawnshops showed no significant association with health improvements. Our findings enrich the expanding literature on demographics and financial histories.

2.Executive Voiced Laughter and Social Approval: An Explorative Machine Learning Study

Authors:Niklas Mueller, Steffen Klug, Andreas Koenig, Alexander Kathan, Lukas Christ, Bjoern Schuller, Shahin Amiriparian

Abstract: We study voiced laughter in executive communication and its effect on social approval. Integrating research on laughter, affect-as-information, and infomediaries' social evaluations of firms, we hypothesize that voiced laughter in executive communication positively affects social approval, defined as audience perceptions of affinity towards an organization. We surmise that the effect of laughter is especially strong for joint laughter, i.e., the number of instances in a given communication venue in which the focal executive and the audience laugh simultaneously. Finally, combining the notions of affect-as-information and negativity bias in human cognition, we hypothesize that the positive effect of laughter on social approval increases with bad organizational performance. We find partial support for our ideas when testing them on panel data comprising 902 German Bundesliga soccer press conferences and media tenor, applying state-of-the-art machine learning approaches for laughter detection as well as sentiment analysis. Our findings contribute to research at the nexus of executive communication, strategic leadership, and social evaluations, especially by introducing laughter as a potentially highly consequential but understudied social lubricant at the executive-infomediary interface. Our research is unique in focusing on the reflexive microprocesses of social evaluations, rather than on the infomediary-routines perspective of infomediaries' evaluations. We also make methodological contributions.

1.Evaluating congestion pricing schemes using agent-based passenger and freight microsimulation

Authors:Peiyu Jing, Ravi Seshadri, Takanori Sakai, Ali Shamshiripour, Andre Romano Alho, Antonios Lentzakis, Moshe E. Ben-Akiva

Abstract: The distributional impacts of congestion pricing have been widely studied in the literature and the evidence on this is mixed. Some studies find that pricing is regressive whereas others suggest that it can be progressive or neutral depending on the specific spatial characteristics of the urban region, existing activity and travel patterns, and the design of the pricing scheme. Moreover, the welfare and distributional impacts of pricing have largely been studied in the context of passenger travel whereas freight has received relatively less attention. In this paper, we examine the impacts of several third-best congestion pricing schemes on both passenger transport and freight in an integrated manner using a large-scale microsimulator (SimMobility) that explicitly simulates the behavioral decisions of the entire population of individuals and business establishments, dynamic multimodal network performance, and their interactions. Through simulations of a prototypical North American city, we find that a distance-based pricing scheme yields the largest welfare gains, although the gains are a modest fraction of toll revenues (around 30%). In the absence of revenue recycling or redistribution, distance-based and cordon-based schemes are found to be particularly regressive. On average, lower income individuals lose as a result of the scheme, whereas higher income individuals gain. A similar trend is observed in the context of shippers -- small establishments having lower shipment values lose on average whereas larger establishments with higher shipment values gain. We perform a detailed spatial analysis of distributional outcomes, and examine the impacts on network performance, activity generation, mode and departure time choices, and logistics operations.

2.Building resilient organizations: The roles of top-down vs. bottom-up organizing

Authors:Stephan Leitner

Abstract: Organizations face numerous challenges posed by unexpected events such as energy price hikes, pandemic disruptions, terrorist attacks, and natural disasters, and the factors that contribute to organizational success in dealing with such disruptions often remain unclear. This paper analyzes the roles of top-down and bottom-up organizational structures in promoting organizational resilience. To do so, an agent-based model of stylized organizations is introduced that features learning, adaptation, different modes of organizing, and environmental disruptions. The results indicate that bottom-up designed organizations tend to have a higher ability to absorb the effects of environmental disruptions, and situations are identified in which either top-down or bottom-up designed organizations have an advantage in recovering from shocks.

3.The use of trade data in the analysis of global phosphate flows

Authors:Matthias Raddant, Martin Bertau, Gerald Steiner

Abstract: In this paper we present a new method to trace the flows of phosphate from the countries where it is mined to the countries where it is used in agricultural production. We achieve this by combining data on phosphate rock mining with data on fertilizer use and data on the international trade of phosphate-related products. We show that by making certain adjustments to data on net exports we can derive the matrix of phosphate flows at the country level to a large degree, and thus contribute to the accuracy of material flow analyses, a result that is important for improving environmental accounting, not only for phosphorus but for many other resources.

1.Cost-benefit of green infrastructures for water management: A sustainability assessment of full-scale constructed wetlands in Northern and Southern Italy

Authors:Laura Garcia-Herrero, Stevo Lavrnic, Valentina Guerrieri, Attilio Toscano, Mirco Milani, Giuseppe Luigi Cirelli, Matteo Vittuari

Abstract: Sustainable water management has become an urgent challenge due to irregular water availability patterns and water quality issues. The effect of climate change exacerbates this phenomenon in water-scarce areas, such as the Mediterranean region, stimulating the implementation of solutions aiming to mitigate or improve environmental, social, and economic conditions. Constructed wetlands are a nature-inspired, technology-oriented solution explored in recent years. Commonly applied to different types of wastewater due to their low cost and simple maintenance, they are considered a promising way to remove pollutants while creating an improved ecosystem by increasing the biodiversity around them. This research aims to assess the sustainability of two typologies of constructed wetlands in two Italian areas: Sicily, with a vertical subsurface flow constructed wetland, and Emilia Romagna, with a surface flow constructed wetland. The assessment is performed by applying a cost-benefit analysis combining primary and secondary data sources. The analysis considered the market and non-market values in both proposed scenarios to establish the feasibility of the two options and identify the most convenient one. Results show that both constructed wetlands bring more benefits than costs (benefit-cost ratio, BCR > 1). In the case of Sicily, the BCR is lower (1) in the constructed wetland scenario, while in its absence it is almost double. If other ecosystem services are included, the constructed wetland scenario reaches a BCR of 4 and an ROI of 5, performing better from a cost perspective than the scenario without it. In Emilia Romagna, the constructed wetland scenario shows a high BCR (10) and ROI (9), while the scenario without the wetland obtains a negative net present value, indicating that the expected benefits do not cover the costs.

1.GPT Agents in Game Theory Experiments

Authors:Fulin Guo

Abstract: This paper explores the potential of using Generative Pre-trained Transformer (GPT)-based agents as participants in strategic game experiments. Specifically, I focus on the finitely repeated ultimatum and prisoner's dilemma games, two well-studied games in economics. I develop prompts to enable GPT agents to understand the game rules and play the games. The results indicate that, given well-crafted prompts, GPT can generate realistic outcomes and exhibit behavior consistent with human behavior in certain important aspects, such as the positive relationship between acceptance rates and offered amounts in the ultimatum game and positive cooperation rates in the prisoner's dilemma game. Some differences between the behavior of GPT and humans are observed in aspects like the evolution of choices over rounds. I also study two treatments in which the GPT agents are prompted to either have social preferences or not. The treatment effects are evident in both games. This preliminary exploration indicates that GPT agents can exhibit realistic performance in simple strategic games and shows the potential of using GPT as a valuable tool in social science research.

1.Well-being policy evaluation methodology based on WE pluralism

Authors:Takeshi Kato

Abstract: Methodologies for evaluating and selecting policies that contribute to the well-being of diverse populations need clarification. To bridge the gap between objective indicators and policies related to well-being, this study shifts from constitutive pluralism based on objective indicators to conceptual pluralism that emphasizes subjective context, develops from subject-object pluralism through individual-group pluralism to WE pluralism, and presents a new policy evaluation method that combines joint fact-finding based on policy plurality. First, to evaluate policies involving diverse stakeholders, I develop from individual subjectivity-objectivity to individual subjectivity and group intersubjectivity, and then move to a narrow-wide WE pluralism in the gradation of I-family-community-municipality-nation-world. Additionally, by referring to some functional forms of well-being, I formulate the dependence of well-being on narrow-wide WE. Finally, given that policies themselves have a plurality of social, ecological, and economic values, I define a set of policies for each of the narrow-wide WE and consider a mapping between the two to provide an evaluation basis. Furthermore, by combining well-being and joint fact-finding on the narrow-wide WE consensus, the policy evaluation method is formulated. The fact-value combined parameter system, combined policy-making approach, and combined impact evaluation are disclosed as examples of implementation. This paper contributes to the realization of a well-being society by bridging philosophical theory and policies based on WE pluralism and presenting a new method of policy evaluation based on subjective context and consensus building.

1.Rankings-Dependent Preferences: A Real Goods Matching Experiment

Authors:Andrew Kloosterman, Peter Troyan

Abstract: We investigate whether preferences for objects received via a matching mechanism are influenced by how highly agents rank them in their reported rank order list. We hypothesize that all else equal, agents receive greater utility for the same object when they rank it higher. The addition of rankings-dependent utility implies that it may not be a dominant strategy to submit truthful preferences to a strategyproof mechanism, and that non-strategyproof mechanisms that give more agents objects they report as higher ranked may increase market welfare. We test these hypotheses with a matching experiment in a strategyproof mechanism, the random serial dictatorship, and a non-strategyproof mechanism, the Boston mechanism. A novel feature of our experimental design is that the objects allocated in the matching markets are real goods, which allows us to directly measure rankings-dependence by eliciting values for goods both inside and outside of the mechanism. Our experimental results confirm that the elicited differences in values do decrease for lower-ranked goods. We find no differences between the two mechanisms for the rates of truth-telling and the final welfare.
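The random serial dictatorship mechanism referenced in the abstract can be stated in a few lines. A minimal sketch, with hypothetical agent names, goods, and preference lists (not taken from the experiment):

```python
import random

def random_serial_dictatorship(preferences, objects, seed=None):
    """Draw a random agent order; each agent in turn takes their
    highest-ranked object that is still available."""
    rng = random.Random(seed)
    order = list(preferences)           # agent identifiers
    rng.shuffle(order)
    available = set(objects)
    allocation = {}
    for agent in order:
        for obj in preferences[agent]:  # reported rank-order list
            if obj in available:
                allocation[agent] = obj
                available.remove(obj)
                break
    return allocation

# Hypothetical example: three agents, three real goods.
prefs = {"a1": ["mug", "pen", "cap"],
         "a2": ["mug", "cap", "pen"],
         "a3": ["pen", "mug", "cap"]}
result = random_serial_dictatorship(prefs, ["mug", "pen", "cap"], seed=0)
```

Under standard, rankings-independent utility, reporting true preferences is a dominant strategy in this mechanism; the paper's hypothesis is precisely that rankings-dependent utility can undermine that property.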

1.Understand Waiting Time in Transaction Fee Mechanism: An Interdisciplinary Perspective

Authors:Luyao Zhang, Fan Zhang

Abstract: Blockchain enables peer-to-peer transactions in cyberspace without a trusted third party. The rapid growth of Ethereum and smart contract blockchains generally calls for well-designed Transaction Fee Mechanisms (TFMs) to allocate limited storage and computation resources. However, existing research on TFMs has yet to consider the waiting time for transactions, which is essential for computer security and economic efficiency. Integrating data from the Ethereum blockchain and memory pool (mempool), we explore how two types of events affect transaction latency. First, we apply regression discontinuity design (RDD) to study the causal inference of the Merge, the most recent significant upgrade of Ethereum. Our results show that the Merge significantly reduces long waiting times, network loads, and market congestion. In addition, we verify our results' robustness by inspecting other compounding factors, such as censorship and unobserved delays of transactions via private channels. Second, examining three major protocol changes during the Merge, we identify block interval shortening as the most plausible cause for our empirical results. Furthermore, in a mathematical model, we show block interval as a unique mechanism design choice for the EIP-1559 TFM to achieve better security and efficiency, generally applicable to the market congestion caused by demand surges. Third, we apply time series analysis to research the interaction of Non-Fungible Token (NFT) drops and market congestion using Facebook Prophet, an open-source algorithm for generating time-series models. Our study identifies NFT drops as a unique source of market congestion -- holiday effects -- beyond trend and season effects. Finally, we envision three future research directions of TFM.
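The regression discontinuity logic applied to the Merge can be illustrated with a stdlib-only sketch: fit a local linear regression on each side of the cutoff and difference the two fits at the cutoff. The data below are synthetic, not Ethereum measurements:

```python
def ols(xs, ys):
    """Simple-regression intercept and slope via closed-form OLS."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope

def rdd_effect(time, outcome, cutoff, bandwidth):
    """Local-linear RDD: fit each side of the cutoff within the
    bandwidth, then difference the two fitted values at the cutoff."""
    left = [(t, y) for t, y in zip(time, outcome)
            if cutoff - bandwidth <= t < cutoff]
    right = [(t, y) for t, y in zip(time, outcome)
             if cutoff <= t <= cutoff + bandwidth]
    bl, sl = ols([t for t, _ in left], [y for _, y in left])
    br, sr = ols([t for t, _ in right], [y for _, y in right])
    return (br + sr * cutoff) - (bl + sl * cutoff)

# Synthetic latency series with a level drop of 5 units at t = 50.
ts = list(range(100))
ys = [20.0 + 0.1 * t - (5.0 if t >= 50 else 0.0) for t in ts]
effect = rdd_effect(ts, ys, cutoff=50, bandwidth=25)
```

On this noiseless series the estimated jump recovers the built-in drop of -5; in practice one would add covariates and robustness checks, as the paper does.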

2.Employer Reputation and the Labor Market

Authors:Ke (Amy) Ma, Sophie Yanying Sheng, Haitian Xie

Abstract: How does employer reputation affect the labor market? We investigate this question using a novel dataset combining crowd-sourced employer reviews with job application data. Labor market institutions crowd-source information about employers to alleviate information problems faced by workers when choosing an employer. Raw crowd-sourced employer ratings are rounded when displayed to job seekers. By exploiting the rounding threshold, we identify the causal impact of Glassdoor ratings using a regression discontinuity framework. We document the effects of such ratings on both the demand and supply sides of the labor market. We find that displayed employer reputation affects an employer's ability to attract workers, especially when the displayed rating is "sticky." Employers respond to having a rating above the rounding threshold by posting more new positions and re-activating more job postings. The effects are the strongest for private, smaller, and less established firms, suggesting that online reputation is a substitute for other types of reputation.

3.Surveying Generative AI's Economic Expectations

Authors:Leland Bybee

Abstract: I introduce a survey of economic expectations formed by querying a large language model (LLM)'s expectations of various financial and macroeconomic variables based on a sample of news articles from the Wall Street Journal between 1984 and 2021. I find the resulting expectations closely match existing surveys including the Survey of Professional Forecasters (SPF), the American Association of Individual Investors, and the Duke CFO Survey. Importantly, I document that LLM-based expectations match many of the deviations from full-information rational expectations exhibited in these existing survey series. The LLM's macroeconomic expectations exhibit the under-reaction commonly found in consensus SPF forecasts. Additionally, its return expectations are extrapolative, disconnected from objective measures of expected returns, and negatively correlated with future realized returns. Finally, using a sample of articles outside of the LLM's training period, I find that the correlation with existing survey measures persists -- indicating these results do not reflect memorization but generalization on the part of the LLM. My results provide evidence for the potential of LLMs to help us better understand human beliefs and navigate possible models of nonrational expectations.

1.Black-box Optimizers vs Taste Shocks

Authors:Yasin Kürşat Önder

Abstract: We evaluate and extend the solution methods for models with binary and multiple continuous choice variables in dynamic programming, particularly in cases where a discrete state space solution method is not viable. Therefore, we approximate the solution using taste shocks or black-box optimizers that applied mathematicians use to benchmark their algorithms. We apply these methods to a default framework in which agents have to solve a portfolio problem with long-term debt. We show that the choice of solution method matters, as taste shocks fail to attain convergence in multidimensional problems. We compare the relative advantages of using four optimization algorithms: the Nelder-Mead downhill simplex algorithm, Powell's direction-set algorithm with LINMIN, the derivative-free trust-region method BOBYQA, and the quasi-Newton Davidon-Fletcher-Powell (DFPMIN) algorithm. All of these methods, except for the last one, are preferred when derivatives cannot be easily computed. Ultimately, we find that Powell's routine evaluated with B-splines, while slow, is the most viable option. BOBYQA came in second place, while the other two methods performed poorly.

2.A Mediation Analysis of the Relationship Between Land Use Regulation Stringency and Employment Dynamics

Authors:Uche Oluku, Shaoming Cheng

Abstract: The paper examines the effects of stringent land use regulations, measured using the Wharton Residential Land Use Regulatory Index (WRLURI), on employment growth during the period 2010-2020 in the Retail, Professional, and Information sectors across 878 local jurisdictions in the United States. All the local jurisdictions exist in both (2006 and 2018) waves of the WRLURI surveys and hence constitute a unique panel dataset. We apply a mediation analytical framework to decompose the direct and indirect effects of land use regulation stringency on sectoral employment growth and specialization. Our analysis suggests a fully mediated pattern in the relationship between excessive land use regulations and employment growth, with housing cost burden as the mediator. Specifically, a one standard deviation increase in the WRLURI index is associated with an approximate increase of 0.8 percentage points in the proportion of cost-burdened renters. Relatedly, a higher prevalence of cost-burdened renters has moderate adverse effects on employment growth in two sectors. A one percentage point increase in the proportion of cost-burdened renters is associated with 0.04 and 0.017 percentage point decreases in employment growth in the Professional and Information sectors, respectively.

3.Macroeconomic factors and Stock exchange return: A Statistical Analysis

Authors:Md. Fazlul Huq Khan, Md. Masum Billah

Abstract: The purpose of this research is to examine the relationship between the Dhaka Stock Exchange index return and macroeconomic variables such as the exchange rate, inflation, and money supply. The long-term relationship between macroeconomic variables and stock market returns has been analyzed by using the Johansen cointegration test, the Augmented Dickey-Fuller (ADF) test, and the Phillips-Perron (PP) test. The results revealed the existence of a cointegrating relationship between stock prices and the macroeconomic variables in the Dhaka Stock Exchange. The consumer price index, money supply, and exchange rates proved to be strongly associated with stock returns, while market capitalization was found to be negatively associated with stock returns. The findings suggest that in the long run, the Dhaka Stock Exchange is reactive to macroeconomic indicators.

1.Cooperation and Cognition in Social Networks

Authors:Edoardo Gallo, Joseph Lee, Yohanes Eko Riyanto, Erwin Wong

Abstract: Social networks can sustain cooperation by amplifying the consequences of a single defection through a cascade of relationship losses. Building on Jackson et al. (2012), we introduce a novel robustness notion to characterize low cognitive complexity (LCC) networks - a subset of equilibrium networks that imposes a minimal cognitive burden to calculate and comprehend the consequences of defection. We test our theory in a laboratory experiment and find that cooperation is higher in equilibrium than in non-equilibrium networks. Within equilibrium networks, LCC networks exhibit higher levels of cooperation than non-LCC networks. Learning is essential for the emergence of equilibrium play.

1.Disturbance Effects on Financial Timberland Returns in Austria

Authors:Petri P. Karenlampi

Abstract: Probability theory is applied to the effect of severe disturbances on the rate of return on capital within multiannual stands growing crops. Two management regimes are discussed, rotations of even-aged plants on the one hand, and uneven-aged semi-stationary state on the other. The effect of any disturbance appears two-fold, contributing to both earnings and capitalization. Results are illustrated using data from a recently published study regarding spruce (Picea abies) forests in Austria. The economic results differ from those of the paper where the data is presented, here indicating continuous-cover forestry is financially inferior to rotation forestry. Any severe disturbance may induce a regime shift from continuous-cover to even-aged forestry. If such a regime shift is not accepted, the disturbance losses reduce profits but do not affect capitalization, making continuous-cover forestry financially more sensitive to disturbances. Revenue from carbon rent favors the management regime with the higher carbon stock. The methods introduced in this paper can be applied to any dataset, regardless of location and tree species.

1.Greening our Laws: Revising Land Acquisition Law for Coal Mining in India

Authors:Sugandha Srivastav, Tanmay Singh

Abstract: Laws that govern land acquisition can lock in old paradigms. We study one such case, the Coal Bearing Areas Act of 1957 (CBAA), which provides minimal social and environmental safeguards and deviates in important ways from the Right to Fair Compensation and Transparency in Land Acquisition, Rehabilitation and Resettlement Act 2013 (LARR). The lack of due diligence protocol in the CBAA confers an undue comparative advantage to coal development, which is inconsistent with India's stance to phase down coal use, reduce air pollution, and advance modern sources of energy. We argue that the premise under which the CBAA was historically justified is no longer valid due to a significant change in the local context. Namely, the environmental and social costs of coal energy are far more salient and the market has cleaner energy alternatives that are cost competitive. We recommend updating land acquisition laws to bring coal under the general purview of LARR or, at minimum, amending the CBAA to ensure adequate environmental and social safeguards are in place, both in letter and practice.

2.Political Strategies to Overcome Climate Policy Obstructionism

Authors:Sugandha Srivastav, Ryan Rafaty

Abstract: Great socio-economic transitions see the demise of certain industries and the rise of others. The losers of the transition tend to deploy a variety of tactics to obstruct change. We develop a political-economy model of interest group competition and garner evidence of tactics deployed in the global climate movement. From this we deduce a set of strategies for how the climate movement competes against entrenched hydrocarbon interests. Five strategies for overcoming obstructionism emerge: (1) Appeasement, which involves compensating the losers; (2) Co-optation, which seeks to instigate change by working with incumbents; (3) Institutionalism, which involves changes to public institutions to support decarbonization; (4) Antagonism, which creates reputational or litigation costs to inaction; and (5) Countervailance, which makes low-carbon alternatives more competitive. We argue that each strategy addresses the problem of obstructionism through a different lens, reflecting a diversity of actors and theories of change within the climate movement. The choice of which strategy to pursue depends on the institutional context.

1.Racial and income-based affirmative action in higher education admissions: lessons from the Brazilian experience

Authors:Rodrigo Zeidan, Silvio Luiz de Almeida, Inácio Bó, Neil Lewis Jr

Abstract: This survey article provides insights regarding the future of affirmative action by analyzing the implementation methods and the empirical evidence on the use of placement quotas in the Brazilian higher education system. All federal universities have required income and racial-based quotas in Brazil since 2012. Affirmative action in federal universities is uniformly applied across the country, which makes evaluating its effects particularly valuable. Affirmative action improves the outcomes of targeted students. Specifically, race-based quotas raise the share of black students in federal universities, an effect not observed with income-based quotas alone. Affirmative action has downstream positive consequences for labor market outcomes. The results suggest that income and race-based quotas beneficiaries experience substantial long-term welfare benefits. There is no evidence of mismatching or negative consequences for targeted students' peers.

1.Selecting Sustainable Optimal Stock by Using Multi-Criteria Fuzzy Decision-Making Approaches Based on the Development of the Gordon Model: A case study of the Toronto Stock Exchange

Authors:Mohsen Mortazavi

Abstract: Investors have always been concerned with choosing the right stock portfolio with the highest efficiency. Therefore, this paper aims to determine the criteria for selecting an optimal stock portfolio with a high-efficiency ratio in the Toronto Stock Exchange using the decision-making trial and evaluation laboratory (DEMATEL) model and multi-criteria fuzzy decision-making approaches, building on the development of the Gordon model. In the current study, the relative weights of the practical factors, namely dividends, the discount rate, and the dividend growth rate, are derived using combined multi-criteria fuzzy decision-making approaches. A group of 10 experts, each with at least ten years of experience in the stock exchange field, was formed to review the different and new aspects of portfolio selection through interaction among the group members and the exchange of attitudes and ideas regarding the criteria. The sequence of influence and effectiveness of the main criteria obtained with DEMATEL shows that the profitability criterion interacts most with the other criteria. The criteria of managing methods and operations (MPO), market, risk, and growth are ranked next in terms of interaction with other criteria. Given the model's appropriate and reliable validity in choosing the optimal stock portfolio, it is recommended that portfolio managers in companies, investment funds, and capital owners use the model to optimally select stocks on the Toronto Stock Exchange.
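The DEMATEL step mentioned above normalises an expert direct-influence matrix and computes the total relation matrix T = N(I - N)^-1; row and column sums of T give each criterion's prominence (R + C) and net influence (R - C). A self-contained sketch, where the 3x3 expert matrix is hypothetical, not the paper's data:

```python
def mat_inv(m):
    """Invert a small square matrix by Gauss-Jordan elimination."""
    n = len(m)
    a = [row[:] + [float(i == j) for j in range(n)]
         for i, row in enumerate(m)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        p = a[col][col]
        a[col] = [v / p for v in a[col]]
        for r in range(n):
            if r != col:
                f = a[r][col]
                a[r] = [v - f * w for v, w in zip(a[r], a[col])]
    return [row[n:] for row in a]

def mat_mul(a, b):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def dematel(direct):
    """Normalise the direct-relation matrix, compute the total relation
    matrix T = N (I - N)^-1, and return prominence (R + C) and net
    influence (R - C) for each criterion."""
    n = len(direct)
    s = max(max(sum(row) for row in direct),
            max(sum(col) for col in zip(*direct)))
    N = [[v / s for v in row] for row in direct]
    IminusN = [[float(i == j) - N[i][j] for j in range(n)]
               for i in range(n)]
    T = mat_mul(N, mat_inv(IminusN))
    R = [sum(row) for row in T]         # influence exerted
    C = [sum(col) for col in zip(*T)]   # influence received
    return [r + c for r, c in zip(R, C)], [r - c for r, c in zip(R, C)]

# Hypothetical 3x3 expert direct-influence matrix (0-4 scale).
D = [[0, 3, 2],
     [1, 0, 3],
     [2, 1, 0]]
prominence, relation = dematel(D)
```

Criteria with the largest prominence values are the ones that interact most with the others, which is how the profitability ranking in the abstract is obtained.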

1.On suspicious tracks: machine-learning based approaches to detect cartels in railway-infrastructure procurement

Authors:Hannes Wallimann, Silvio Sticher

Abstract: In railway infrastructure, construction and maintenance are typically procured using competitive procedures such as auctions. However, these procedures only fulfill their purpose - using (taxpayers') money efficiently - if bidders do not collude. Employing a unique dataset of the Swiss Federal Railways, we present two methods to detect potential collusion: First, we apply machine learning to screen tender databases for suspicious patterns. Second, we establish a novel category-managers' tool, which allows for sequential and decentralized screening. To the best of our knowledge, we are the first to illustrate the adaptation and application of machine-learning based price screens to a railway-infrastructure market.
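A classic price screen that machine-learning approaches of this kind build on flags tenders whose bids are unusually similar, via the coefficient of variation. A minimal sketch (bids and threshold are hypothetical; this is a single screen feature, not the authors' model):

```python
from statistics import mean, stdev

def cv_screen(tenders, threshold=0.05):
    """Flag tenders whose bid coefficient of variation falls below a
    threshold: unusually similar bids can indicate cover bidding.
    `tenders` maps tender id -> list of submitted bids."""
    flagged = {}
    for tid, bids in tenders.items():
        cv = stdev(bids) / mean(bids)
        if cv < threshold:
            flagged[tid] = round(cv, 4)
    return flagged

# Hypothetical bids (in CHF) for two track-maintenance tenders.
bids = {"T1": [1_000_000, 1_010_000, 1_005_000],   # tightly clustered
        "T2": [900_000, 1_200_000, 1_450_000]}     # dispersed
suspicious = cv_screen(bids)
```

In a full pipeline such per-tender features would be fed to a classifier rather than thresholded by hand.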

2.From Misalignment to Synergy: Analysis of Patents from Indian Universities & Research Institutions

Authors:Shoyeb Khan, Satyendra Kumar Sharma, Arnab Kumar Laha

Abstract: Indian Universities and Research Institutions have been the cornerstone of human resource development in the country, nurturing bright minds and shaping the leaders of tomorrow. Their unwavering commitment to excellence in education and research has not only empowered individuals but has also made significant contributions to the overall growth and progress of the nation. Despite the significant strides made by Indian universities and research institutions, the country still lags behind many developed nations in terms of the number of patents filed as well as in the commercialization of the granted patents. With 34 percent of students choosing STEM fields in India, and over 750 universities and nearly 40,000 colleges, the concentration of patent applications in only the top 10 institutions raises concerns. Innovation and technological advancement have become key drivers of economic growth and development in modern times. Therefore, our study aims to unravel the patent landscape of Indian Universities and Research Institutions, examining it through the lens of supply and demand for innovations and ideas. Delving into the dynamics of patent filing and innovation trends, this study seeks to shed light on the current state of intellectual property generation in the country's academic and research ecosystem.

1.How does a 'one-size-fits-all' public works contract do it better? An assessment of infrastructure provision in Italy

Authors:Massimo Finocchiaro Castro, Calogero Guccio, Ilde Rizzo

Abstract: Public infrastructure procurement is crucial as a prerequisite for public and private investments and for economic and social capital growth. However, low performance in execution severely hinders infrastructure provision and benefits delivery. One of the most sensitive phases in public infrastructure procurement is the design, because of the strategic relationship that it potentially creates between procurers and contractors in the execution stage, affecting the costs and the duration of the contract. In this paper, using recent developments in non-parametric frontiers and propensity score matching, we evaluate the performance in the execution of public works in Italy. The analysis provides robust evidence of significant improvement in performance where procurers opt for design-and-build contracts, which lead to lower transaction costs, allowing contractors to better accommodate the project during execution. Our findings bear considerable policy implications.

1.Robust Market Potential Assessment: Designing optimal policies for low-carbon technology adoption in an increasingly uncertain world

Authors:Tom Savage, Antonio del Rio Chanona, Gbemi Oluleye

Abstract: Increasing the adoption of alternative technologies is vital to ensure a successful transition to net-zero emissions in the manufacturing sector. Yet there is no model to analyse technology adoption and the impact of policy interventions in generating sufficient demand to reduce cost. Such a model is vital for assessing policy instruments for the implementation of future energy scenarios. The design of successful policies for technology uptake becomes increasingly difficult when associated market forces/factors are uncertain, such as energy prices or technology efficiencies. In this paper we formulate a novel robust market potential assessment problem under uncertainty, resulting in policies that are immune to uncertain factors. We demonstrate two case studies: the potential use of carbon capture and storage for iron and steel production across the EU, and the transition to hydrogen from natural gas in steam boilers across the chemicals industry in the UK. Each robust optimisation problem is solved using an iterative cutting-plane algorithm which enables existing models to be solved under uncertainty. By taking advantage of parallelisation we are able to solve the nonlinear robust market assessment problem for technology adoption in computation times within the same order of magnitude as the nominal problem. Policy makers often wish to trade off certainty with the effectiveness of a solution. Therefore, we apply an approximation to chance constraints, varying the amount of uncertainty to locate less certain but more effective solutions. Our results demonstrate the possibility of locating robust policies for the implementation of low-carbon technologies, as well as providing direct insights for policy-makers into the decrease in policy effectiveness resulting from increasing robustness. The approach we present is extensible to a large number of policy design and alternative technology adoption problems.
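The iterative cutting-plane scheme alternates a master problem over the scenarios found so far with a pessimisation step that searches for the worst-case scenario and adds it as a cut. A toy sketch in which brute-force grid searches stand in for the full optimisation models, with a hypothetical objective:

```python
def cutting_plane_robust(objective, x_grid, u_grid, tol=1e-6, max_iter=50):
    """Solve min_x max_u f(x, u) by iteratively adding worst-case
    scenarios (cuts) until no violated cut remains."""
    scenarios = [u_grid[0]]                 # start with one scenario
    for _ in range(max_iter):
        # Master: best decision against the scenarios found so far.
        x = min(x_grid,
                key=lambda x: max(objective(x, u) for u in scenarios))
        master_val = max(objective(x, u) for u in scenarios)
        # Pessimisation: worst scenario for the current decision.
        u_worst = max(u_grid, key=lambda u: objective(x, u))
        worst_val = objective(x, u_worst)
        if worst_val - master_val <= tol:   # no violated cut remains
            return x, worst_val
        scenarios.append(u_worst)           # add the cut and repeat
    return x, worst_val

# Toy robust problem: f(x, u) = (x - u)^2 with u in [0, 1];
# the robust optimum is x = 0.5 with worst-case value 0.25.
f = lambda x, u: (x - u) ** 2
xs = [i / 100 for i in range(101)]
us = [i / 100 for i in range(101)]
x_star, val = cutting_plane_robust(f, xs, us)
```

The appeal of the scheme, as in the paper, is that only a handful of scenarios are ever active, so each master problem stays close to the nominal problem in size.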

1.Climate uncertainty impacts on optimal mitigation pathways and social cost of carbon

Authors:Christopher J. Smith, Alaa Al Khourdajie, Pu Yang, Doris Folini

Abstract: Emissions pathways used in climate policy analysis are often derived from integrated assessment models (IAMs). However, such emissions pathways do not typically include climate feedbacks on socioeconomic systems and by extension do not consider climate uncertainty in their construction. Here we show that climate uncertainty alone significantly changes the cost-benefit optimal CO$_2$ emissions, varying from -14 to +12 GtCO$_2$ yr$^{-1}$ in 2050 (5-95% range) for an ensemble of scenarios that limit warming to 1.5°C with low overshoot. Climate uncertainty is also responsible for a factor of five range in the social cost of carbon (SCC) in this scenario ensemble. Equilibrium climate sensitivity (ECS) and the strength of present-day aerosol radiative forcing are strong determinants of SCC and optimal mid-century CO$_2$ emissions. This confirms that reducing climate uncertainty can refine cost-optimal emissions projections, and points to a missing feedback between climate and emissions in scenario construction.

1.Democratic Policy Decisions with Decentralized Promises Contingent on Vote Outcome

Authors:Ali Lazrak, Jianfeng Zhang

Abstract: We study how decentralized utility transfer promises affect collective decision-making by voting. Committee members with varying levels of support and opposition for an efficient reform can make enforceable promises before voting. An equilibrium requires stability and minimal promises. Equilibrium promises exist and are indeterminate, but do share several key characteristics. Equilibria require transfer promises from high to low intensity members and result in enacting the reform. When reform supporters lack sufficient voting power, promises must reach across the aisle. Even if the coalition of reform supporters is decisive, promises must preclude the least enthusiastic supporters of the reform from being enticed to overturn the decision. In that case, equilibrium promises do not need to reach across the aisle. We also discuss a finite sequence of promises that achieve an equilibrium.

2.Economic consequences of the spatial and temporal variability of climate change

Authors:Francisco Estrada, Richard S. J. Tol, Wouter Botzen

Abstract: Damage functions in integrated assessment models (IAMs) map changes in climate to economic impacts and form the basis for most estimates of the social cost of carbon. Implicit in these functions lies an unwarranted assumption that restricts the spatial variation (Svar) and temporal variability (Tvar) of changes in climate to be null. This could bias damage estimates and the climate policy advice from IAMs. While the effects of Tvar have been studied in the literature, those of Svar and their interactions with Tvar have not. Here we present estimates of the economic costs of climate change that account for both Tvar and Svar, as well as for the seasonality of damages across sectors. Contrary to the results of recent studies, which show little effect of Tvar on expected losses, we reveal that ignoring Svar produces large downward biases, as warming is highly heterogeneous over space. Using a conservative calibration for the damage function, we show that previous estimates are biased downwards by about 23-36%, which represents additional losses of about US$1,400-US$2,300 billion by 2050 and US$17-US$28 trillion by the end of the century, under a high emissions scenario. The present value of losses during the period 2020-2100 would be larger than reported in previous studies by $47-$66 trillion or about 1/2 to 3/4 of annual global GDP in 2020. Our results imply that using global mean temperature change in IAMs as a summary measure of warming is not adequate for estimating the costs of climate change. Instead, IAMs should include a more complete description of climate conditions.
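The downward bias from ignoring spatial variation follows from Jensen's inequality when damages are convex in warming: evaluating a convex damage function at the global mean understates the mean of regional damages. A toy numerical check, in which the damage coefficient, regional warming values, and equal regional weights are illustrative only:

```python
def damages(temp, a=0.0023):
    """Hypothetical convex (quadratic) damage function: fraction of
    GDP lost at `temp` degrees C of warming. Coefficient illustrative."""
    return a * temp ** 2

# Hypothetical regional warming (deg C) sharing a global mean of 3.0.
regional_warming = [1.5, 2.0, 3.0, 4.0, 4.5]
global_mean = sum(regional_warming) / len(regional_warming)

loss_at_mean = damages(global_mean)                # mean-based estimate
mean_of_losses = sum(damages(t)
                     for t in regional_warming) / len(regional_warming)
```

By Jensen's inequality `mean_of_losses` exceeds `loss_at_mean` whenever warming varies across regions, which is the mechanism behind the bias the paper quantifies.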

1.Low-carbon Lithium Extraction Makes Deep Geothermal Plants Cost-competitive in Energy Systems

Authors:Jann Michael Weinand, Ganga Vandenberg, Stanley Risch, Johannes Behrens, Noah Pflugradt, Jochen Linßen, Detlef Stolten

Abstract: Lithium is a critical material for the energy transition, but conventional procurement methods have significant environmental impacts. In this study, we use regional energy system optimizations to investigate the techno-economic potential of the low-carbon alternative of direct lithium extraction in deep geothermal plants. We show that geothermal plants will become cost-competitive in conjunction with lithium extraction, even under unfavorable conditions, and will partially displace photovoltaics, wind power, and storage from energy systems. Our analysis indicates that if 10% of municipalities in the Upper Rhine Graben area in Germany constructed deep geothermal plants, they could provide enough lithium to produce about 1.2 million electric vehicle battery packs per year, equivalent to 70% of today's annual electric vehicle registrations in the European Union. This approach could offer significant environmental benefits and also has high potential for mass application in other countries, such as the United States, United Kingdom, France, and Italy, highlighting the importance of further research and development of this technology.
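The economics of co-production can be sketched in a back-of-the-envelope calculation (every number below is an assumption for illustration, not taken from the study): revenue from lithium extraction is credited against the geothermal plant's annualized cost, lowering its effective levelized cost of electricity (LCOE).

```python
# Hypothetical plant figures (assumed, not from the paper)
annualized_cost = 18e6   # EUR/year, capex + opex
electricity_out = 80e6   # kWh of electricity sold per year
lithium_out_t = 1500     # tonnes of lithium carbonate equivalent per year
lithium_margin = 8000    # EUR/tonne net margin on extracted lithium

lcoe_plain = annualized_cost / electricity_out
lcoe_with_li = (annualized_cost - lithium_out_t * lithium_margin) / electricity_out

print(f"LCOE without lithium credit: {lcoe_plain * 100:.1f} ct/kWh")
print(f"LCOE with lithium credit:    {lcoe_with_li * 100:.1f} ct/kWh")
```

Under these assumed figures the lithium credit cuts the effective LCOE by two thirds, which is the qualitative channel through which co-production can make otherwise unprofitable deep geothermal plants competitive with other generation options.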

1.Visibility graph analysis of the grains and oilseeds indices

Authors:Hao-Ran Liu, Wei-Xing Zhou

Abstract: The Grains and Oilseeds Index (GOI) and its sub-indices of wheat, maize, soyabeans, rice, and barley are daily price indices reflecting the price changes of the global spot markets of staple agro-food crops. In this paper, we carry out a visibility graph (VG) analysis of the GOI and its five sub-indices. Maximum likelihood estimation shows that the degree distributions of the VGs display power-law tails, except for rice. The average clustering coefficients of the six VGs are quite large (>0.5) and exhibit a clear power-law relation with respect to the average degrees of the VGs. For each VG, the clustering coefficients of nodes are inversely proportional to their degrees for large degrees and are correlated with their degrees as a power law for small degrees. All six VGs exhibit small-world characteristics to some extent. The degree-degree correlation coefficients show that the VGs for the maize and soyabeans indices exhibit weak assortative mixing patterns, while the other four VGs are weakly disassortative. The average nearest neighbor degree functions have similar patterns, and each exhibits a complex mixing pattern: it decreases for small degrees, increases for intermediate degrees, and decreases again for large degrees.
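The construction underlying this analysis is the natural visibility graph of Lacasa et al.: each daily observation becomes a node, and two nodes are linked if the straight line between them passes above every observation in between. A minimal sketch (not the authors' code; the price series is a hypothetical stand-in for a GOI sub-index):

```python
def visibility_edges(series):
    """Natural visibility graph: link (a, b) if every intermediate point
    lies strictly below the straight line from (a, y_a) to (b, y_b)."""
    n = len(series)
    edges = []
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                series[c] < series[b] + (series[a] - series[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.append((a, b))
    return edges

# Toy daily index values (hypothetical)
prices = [100.0, 102.5, 101.0, 105.2, 103.7, 104.1, 108.9, 107.0]
edges = visibility_edges(prices)

# Node degrees: the starting point for the degree-distribution and
# clustering analyses described in the abstract
degree = [sum(a == i or b == i for a, b in edges) for i in range(len(prices))]
print(edges)
print(degree)
```

Consecutive observations are always mutually visible, so the VG of a series of length n has at least n-1 edges; local price peaks acquire high degree, which is what produces the heavy-tailed degree distributions the abstract reports.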

1.Five guidelines to improve context-aware process selection: an Australian banking perspective

Authors:Nigel Adams, Adriano Augusto, Michael Davern, Marcello La Rosa

Abstract: As the first phase in the Business Process Management (BPM) lifecycle, process identification addresses the problem of identifying which processes to prioritize for improvement. Process selection plays a critical role in this phase, but it is a step with known pitfalls. Decision makers frequently rely on subjective criteria, and their knowledge of the alternative processes put forward for selection is often inconsistent. This leads to poor-quality decision-making and wastes resources. In recent years, a rejection of a one-size-fits-all approach to BPM in favor of a more context-aware approach has gained significant academic attention. In this study, the role of context in the process selection step is considered. The context is qualitative, subjective, sensitive to decision-making bias, and politically charged. We applied a design-science approach and engaged industry decision makers through a combination of research methods to assess how different configurations of process inputs influence and ultimately improve the quality of the process selection step. The study highlights the impact of framing effects on context and provides five guidelines to improve effectiveness.

2.Mapping job complexity and skills into wages

Authors:Sabrina Aufiero, Giordano De Marzo, Angelica Sbardella, Andrea Zaccaria

Abstract: We use algorithmic and network-based tools to build and analyze the bipartite network connecting jobs with the skills they require. We quantify and represent the relatedness between jobs and skills using statistically validated networks. Using the fitness and complexity algorithm, we compute a skill-based complexity of jobs. This quantity is positively correlated with the average salary, abstraction, and non-routineness level of jobs. Furthermore, coherent jobs -- defined as those requiring closely related skills -- have, on average, lower wages. We find that salaries may not always reflect the intrinsic value of a job, but rather other wage-setting dynamics that may not be directly related to its skill composition. Our results provide valuable information for policymakers, employers, and individuals to better understand the dynamics of the labor market and make informed decisions about their careers.
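The fitness and complexity algorithm referenced here is the nonlinear iterative map of Tacchella et al., originally defined on country-product networks. A hedged sketch of that iteration on a hypothetical job-skill matrix (not the authors' data or code): a job's fitness aggregates the complexity of its skills, while a skill's complexity is penalized when low-fitness jobs also require it.

```python
# Toy bipartite matrix (assumed): rows = jobs, columns = skills,
# M[j][s] = 1 if job j requires skill s
M = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 1],
]
n_jobs, n_skills = len(M), len(M[0])

fitness = [1.0] * n_jobs        # initial job fitness
complexity = [1.0] * n_skills   # initial skill complexity

for _ in range(100):
    # Un-normalized updates, both computed from the previous iterate
    f_new = [sum(M[j][s] * complexity[s] for s in range(n_skills))
             for j in range(n_jobs)]
    q_new = [1.0 / sum(M[j][s] / fitness[j] for j in range(n_jobs))
             for s in range(n_skills)]
    # Normalize both vectors to mean 1 at every step, as in the original scheme
    f_mean = sum(f_new) / n_jobs
    q_mean = sum(q_new) / n_skills
    fitness = [f / f_mean for f in f_new]
    complexity = [q / q_mean for q in q_new]

print("job fitness:     ", [round(f, 3) for f in fitness])
print("skill complexity:", [round(q, 3) for q in complexity])
```

In this toy example the job requiring the most (and rarest) skills ends up with the highest fitness, which is the skill-based complexity measure the abstract correlates with salaries.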

3.Adapting to Disruptions: Flexibility as a Pillar of Supply Chain Resilience

Authors:Ambra Amico, Luca Verginer, Giona Casiraghi, Giacomo Vaccario, Frank Schweitzer

Abstract: Supply chain disruptions cause shortages of raw materials and products. To increase resilience, i.e., the ability to cope with shocks, substituting goods in established supply chains can become an effective alternative to creating new distribution links. We demonstrate its impact on supply deficits through a detailed analysis of the US opioid distribution system. Reconstructing 40 billion empirical distribution paths, our data-driven model allows a unique inspection of policies that increase substitution flexibility. Our approach enables policymakers to quantify the trade-off between increased flexibility, i.e., reduced supply deficits, and the increased complexity of the supply chain, which could make it more expensive to operate.
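The core mechanism can be shown with a minimal numerical illustration (hypothetical products and quantities, not the paper's model or data): when two products are declared substitutable, demand unmet by one can be served by surplus of the other, shrinking the total deficit.

```python
# Hypothetical supply and demand per product (units are illustrative)
supply = {"oxycodone_10mg": 80, "oxycodone_20mg": 130, "morphine_15mg": 60}
demand = {"oxycodone_10mg": 100, "oxycodone_20mg": 100, "morphine_15mg": 70}

def deficit_without_substitution(supply, demand):
    # Each product must meet its own demand in isolation
    return sum(max(demand[p] - supply[p], 0) for p in demand)

def deficit_with_substitution(supply, demand, group):
    # Pool supply and demand within a substitution group; the rest
    # remains product-by-product
    pooled_supply = sum(supply[p] for p in group)
    pooled_demand = sum(demand[p] for p in group)
    rest = [p for p in demand if p not in group]
    return (max(pooled_demand - pooled_supply, 0)
            + sum(max(demand[p] - supply[p], 0) for p in rest))

group = ["oxycodone_10mg", "oxycodone_20mg"]  # assumed substitutable dosages
print(deficit_without_substitution(supply, demand))      # 20 + 0 + 10 = 30
print(deficit_with_substitution(supply, demand, group))  # 0 + 10 = 10
```

Each widening of the substitution group weakly reduces the deficit but enlarges the set of routing options a distributor must manage, which is the flexibility-complexity trade-off the abstract quantifies.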

1.On the state-space model of unawareness

Authors:Alex A. T. Rathke

Abstract: We show that the knowledge of an agent carrying non-trivial unawareness violates the standard property of 'necessitation'; therefore, necessitation cannot be used to refute the standard state-space model. A revised version of necessitation preserves non-trivial unawareness and resolves the classical Dekel-Lipman-Rustichini impossibility result. We propose a generalised knowledge operator consistent with the standard state-space model of unawareness, including models with an infinite state space.
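The tension the abstract addresses can be made concrete in a small computational sketch (our illustration, not the paper's formalism): for a standard partitional knowledge operator K, which satisfies necessitation (K of the whole state space is the whole state space), the usual unawareness operator U(E) = not-K(E) and not-K(not-K(E)) comes out empty for every event, i.e., unawareness is trivial. This is the kind of result a revised necessitation property is meant to escape.

```python
from itertools import chain, combinations

states = frozenset({1, 2, 3})
# Partitional possibility correspondence: the agent knows only her cell
partition = [frozenset({1, 2}), frozenset({3})]

def cell(w):
    return next(c for c in partition if w in c)

def K(event):
    # Agent knows E at w iff her whole cell at w is contained in E
    return frozenset(w for w in states if cell(w) <= event)

def U(event):
    # Standard unawareness operator: neither knows E nor knows she
    # does not know E
    not_K_E = states - K(event)
    return not_K_E & (states - K(not_K_E))

assert K(states) == states  # necessitation holds for this operator

# Check triviality of unawareness over every event
events = [frozenset(s) for s in chain.from_iterable(
    combinations(states, r) for r in range(len(states) + 1))]
print(all(U(e) == frozenset() for e in events))  # prints True
```

With a partitional K, the set of states where E is not known is itself a union of partition cells, so the agent always knows that she does not know, and U(E) is empty everywhere.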