Credit risk, also known as default risk, is the likelihood of a firm losing money when a business partner defaults. If the liabilities are not met under the terms of the contract, the counterparty may default, resulting in a loss for the firm. There is no clear way to distinguish, prior to default, between organizations that will default and those that will not; at best, we can make probabilistic estimates of the risk of default. In this regard, there are two types of credit risk default models: structural and reduced-form models. Structural models calculate the likelihood of a company defaulting based on its assets and liabilities: a company defaults if the market value of its assets is less than the debt it owes. Reduced-form models typically assume an external cause of default, such as a Poisson jump process driven by a stochastic process; they model default as a random event with no regard for the company's balance sheet. This paper provides a review of credit risk default models.
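As a minimal illustration of the structural approach just described, the sketch below computes a Merton-style default probability: the firm defaults if the market value of its assets falls below its debt at the horizon. The lognormal-asset assumption and all figures are illustrative, not taken from the paper.

```python
from math import log, sqrt
from statistics import NormalDist

def merton_default_probability(assets: float, debt: float,
                               drift: float, vol: float,
                               horizon: float) -> float:
    """Probability that the asset value ends below the debt level at the
    horizon, under lognormal asset dynamics (classic structural setup)."""
    d2 = (log(assets / debt) + (drift - 0.5 * vol ** 2) * horizon) \
         / (vol * sqrt(horizon))
    return NormalDist().cdf(-d2)

# Illustrative firm: assets 120, debt 100, 5% drift, 25% volatility, 1 year
print(merton_default_probability(120.0, 100.0, 0.05, 0.25, 1.0))
```

A reduced-form model would instead specify a default intensity directly, without reference to the balance sheet.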
This article presents a comprehensive framework for valuing financial instruments subject to credit risk. In particular, we focus on the impact of default dependence on asset pricing, as correlated default risk is one of the most pervasive threats in financial markets. We analyze how swap rates are affected by bilateral counterparty credit risk, and how CDS spreads depend on the trilateral credit risk of the buyer, seller, and reference entity in a contract. Moreover, we study the effect of collateralization on valuation, since the majority of OTC derivatives are collateralized. The model shows that a fully collateralized swap is risk-free, whereas a fully collateralized CDS is not equivalent to a risk-free one.
This paper argues that the reduced-form jump diffusion model may not be appropriate for credit risk modeling. To correctly value hybrid defaultable financial instruments, e.g., convertible bonds, we present a new framework that relies on the probability distribution of a default jump rather than the default jump itself, as the default jump is usually inaccessible. As such, the model can back out the market prices of convertible bonds. A prevailing belief in the market is that convertible arbitrage is mainly due to convertible underpricing. Empirically, however, we do not find evidence supporting the underpricing hypothesis. Instead, we find that convertibles have relatively large positive gammas. As a typical convertible arbitrage strategy employs delta-neutral hedging, a large positive gamma can make the portfolio highly profitable, especially for a large movement in the underlying stock price.
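The gamma effect invoked here can be made concrete with the textbook second-order profit-and-loss approximation for a delta-hedged position (a generic illustration with made-up numbers, not the authors' model): the delta terms cancel, leaving roughly one half times gamma times the squared move.

```python
def delta_hedged_pnl(gamma: float, ds: float) -> float:
    """Second-order Taylor approximation of a delta-neutral portfolio's P&L:
    the first-order (delta) terms cancel, leaving the gamma contribution."""
    return 0.5 * gamma * ds ** 2

# A positive-gamma convertible position profits from moves in either direction
for ds in (-10.0, -5.0, 5.0, 10.0):
    print(f"move {ds:+.0f}: P&L ~ {delta_hedged_pnl(gamma=0.08, ds=ds):.2f}")
```

This is why a large positive gamma pays off precisely when the underlying stock moves a lot, in either direction.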
This article presents a new model for valuing a credit default swap (CDS) contract that is affected by multiple credit risks of the buyer, seller and reference entity. We show that default dependency has a significant impact on asset pricing. In fact, correlated default risk is one of the most pervasive threats in financial markets. We also show that a fully collateralized CDS is not equivalent to a risk-free one. In other words, full collateralization cannot eliminate counterparty risk completely in the CDS market.
This paper presents a Least Squares Monte Carlo approach for accurately calculating credit value adjustment (CVA). In contrast to previous studies, the model relies on the probability distribution of a default time/jump rather than the default time itself, as the default time is usually inaccessible. As such, the model can achieve a high order of accuracy with a relatively easy implementation. We find that the valuation of a defaultable derivative is normally determined via backward induction when its payoff could be positive or negative. Moreover, the model can naturally capture wrong-way or right-way risk.
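To make the regression step concrete, here is a minimal least-squares Monte Carlo sketch computing a toy CVA for a forward-like contract. Everything here is an illustrative assumption (flat default intensity, fixed recovery, polynomial basis, made-up market parameters); the paper's own model works from the probability distribution of the default time rather than this simple setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy contract: pays S_T - K at maturity T; exposure can be either sign.
n_paths, n_steps, T, r, sigma, K = 20_000, 50, 5.0, 0.03, 0.2, 100.0
dt = T / n_steps
s = np.empty((n_paths, n_steps + 1))
s[:, 0] = 100.0
for t in range(n_steps):
    z = rng.standard_normal(n_paths)
    s[:, t + 1] = s[:, t] * np.exp((r - 0.5 * sigma**2) * dt
                                   + sigma * np.sqrt(dt) * z)

payoff = s[:, -1] - K                        # terminal payoff per path

lam, recovery = 0.02, 0.4                    # flat intensity and recovery
cva = 0.0
for t in range(1, n_steps + 1):
    # Least-squares regression of the discounted payoff on polynomials of
    # the state estimates the conditional mark-to-market V_t on each path.
    basis = np.vander(s[:, t], 3)            # regressors: S_t^2, S_t, 1
    disc_payoff = np.exp(-r * (T - t * dt)) * payoff
    coef, *_ = np.linalg.lstsq(basis, disc_payoff, rcond=None)
    v_t = basis @ coef
    epe = np.mean(np.maximum(v_t, 0.0))      # expected positive exposure
    pd = np.exp(-lam * (t - 1) * dt) - np.exp(-lam * t * dt)
    cva += (1 - recovery) * np.exp(-r * t * dt) * epe * pd

print(f"toy CVA ~ {cva:.4f}")
```

The regression is what lets a single set of simulated paths supply the conditional expectations needed at every exposure date.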
Currently, under conditions of permanent financial risks that hamper sustainable economic growth in the financial sector, the development of evaluation and risk management methods, both those regulated by Basel II and III and others, seems to be of special importance. Reputation risk is one of the significant risks affecting the reliability and credibility of commercial banks. The importance of reputation risk management and the quality of its assessment remain relevant, as the probability of a decrease in or loss of business reputation influences financial results and the degree of customers', partners' and stakeholders' confidence. By means of simulation modeling based on Bayesian networks and fuzzy data analysis, the article characterizes the mechanism of reputation risk assessment and the evaluation of possible losses in banks by plotting normal and lognormal distribution functions. Monte Carlo simulation is used to calculate the probability of losses caused by reputation risks. The degree of standardized histogram similarity is determined on the basis of fuzzy data analysis applying the Hamming distance method. A tree-like hierarchy based on the OWA operator is used to aggregate the data, with Fishburne's coefficients as the convolution scales. The mechanism takes into account the impact of criteria such as return on equity, goodwill value, the risk assets ratio, the share of productive assets in net assets, the efficiency ratio of interest-bearing liabilities, the risk ratio of credit operations, the funding ratio and the reliability index on the business reputation of the bank. The suggested methods and recommendations might be applied to develop a decision-making mechanism targeted at the implementation of a reputation risk management system in commercial banks, as well as to optimize risk management technologies.
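A minimal sketch of the Monte Carlo step described above, with a lognormal loss-severity distribution and purely illustrative parameters (in practice these would be fitted to the bank's data):

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative lognormal severity parameters for reputational losses
mu, sigma, n_sims = 1.5, 0.8, 100_000
losses = rng.lognormal(mean=mu, sigma=sigma, size=n_sims)

threshold = 15.0                            # loss level of interest
prob_exceed = np.mean(losses > threshold)   # simulated exceedance probability
var_99 = np.quantile(losses, 0.99)          # 99% quantile of simulated losses

print(f"P(loss > {threshold}) ~ {prob_exceed:.4f}; 99% quantile ~ {var_99:.2f}")
```

The fuzzy-similarity and OWA-aggregation stages described in the abstract would then operate on histograms of such simulated losses.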
This article presents a new model for valuing financial contracts subject to credit risk and collateralization. Examples include the valuation of a credit default swap (CDS) contract that is affected by the trilateral credit risk of the buyer, seller and reference entity. We show that default dependency has a significant impact on asset pricing. In fact, correlated default risk is one of the most pervasive threats in financial markets. We also show that a fully collateralized CDS is not equivalent to a risk-free one. In other words, full collateralization cannot eliminate counterparty risk completely in the CDS market.
This paper presents a new model for valuing hybrid defaultable financial instruments, such as convertible bonds. In contrast to previous studies, the model relies on the probability distribution of a default jump rather than the default jump itself, as the default jump is usually inaccessible. As such, the model can back out the market prices of convertible bonds. A prevailing belief in the market is that convertible arbitrage is mainly due to convertible underpricing. Empirically, however, we do not find evidence supporting the underpricing hypothesis. Instead, we find that convertibles have relatively large positive gammas. As a typical convertible arbitrage strategy employs delta-neutral hedging, a large positive gamma can make the portfolio highly profitable, especially for a large movement in the underlying stock price.
There is a commonly made distinction between two types of scientists: risk-taking, trailblazing mavericks and detail-oriented followers. A number of recent papers have discussed the question of what a desirable mixture of mavericks and followers looks like. Answering this question is most useful if a scientific community can be steered toward such a desirable mixture. One attractive route is through credit incentives: manipulating rewards so that reward-seeking scientists are likely to form the desired mixture of their own accord. Here I argue that this idea is less straightforward than it may seem. Interpreting mavericks as scientists who prioritize rewards over speed and risk, I show in a deliberately simple model that there is a fixed mixture which is not particularly likely to be desirable and which credit incentives cannot alter. I consider a way around this result, but this has some major drawbacks. I conclude that credit incentives are not as promising a way to create a desirable mixture of mavericks and followers as one might have thought.
This article presents a generic model for pricing financial derivatives subject to counterparty credit risk. Both unilateral and bilateral types of credit risk are considered. Our study shows that credit risk should be modeled as American-style options in most cases, which require a backward induction valuation. To correct a common mistake in the literature, we emphasize that the market value of a defaultable derivative is actually a risky value rather than a risk-free value. Credit value adjustment (CVA) is also elaborated. A practical framework is developed for pricing defaultable derivatives and calculating their CVAs at a portfolio level.
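A minimal sketch of the backward-induction idea: value a defaultable claim on a binomial tree, applying a survival/default adjustment at every step rather than discounting a risk-free value. The tree, the per-step default probability, and the recovery rate are all illustrative assumptions, not the article's model.

```python
import numpy as np

# Toy binomial tree for a defaultable call-like claim
n, dt, r, sigma, s0, strike = 100, 0.01, 0.03, 0.2, 100.0, 100.0
p_d, recovery = 0.002, 0.4          # per-step default probability, recovery

u = np.exp(sigma * np.sqrt(dt))
d = 1.0 / u
q = (np.exp(r * dt) - d) / (u - d)  # risk-neutral up-move probability

prices = s0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
value = np.maximum(prices - strike, 0.0)     # terminal payoff

for _ in range(n):                           # backward induction
    cont = np.exp(-r * dt) * (q * value[:-1] + (1 - q) * value[1:])
    # Risky value at each node: survive with prob (1 - p_d); on default,
    # receive only recovery on the positive part of the continuation value.
    value = (1 - p_d) * cont + p_d * recovery * np.maximum(cont, 0.0)

print(f"toy risky value ~ {value[0]:.4f}")
```

Because the default adjustment is applied node by node, the valuation cannot be collapsed into a single terminal expectation, which is the sense in which it resembles an American-style option.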
This chapter argues for deregulation of the credit-rating market. Credit-rating agencies are supposed to contribute to the informational needs of investors trading bonds. They provide ratings of debt issued by corporations and governments, as well as of structured debt instruments (e.g. mortgage-backed securities). As many academics, regulators, and commentators have pointed out, the ratings of structured instruments turned out to be highly inaccurate, and, as a result, they have argued for tighter regulation of the industry. This chapter shows, however, that the role of credit-rating agencies in achieving justice in finance is not as great as these commentators believe. It therefore argues instead for deregulation. Since the 1930s, lawgivers have unjustifiably elevated the rating agencies into official, legally binding sources of information concerning credit risk, thereby unjustifiably causing many institutional investors to outsource their epistemic responsibilities, that is, their responsibility to investigate credit risk themselves.
Enterprise Risk Management and security have become a fundamental part of Enterprise Architecture, so several frameworks and modeling languages have been designed to support the activities associated with these areas. ArchiMate's Risk and Security Overlay is one such proposal, endorsed by The Open Group. We investigate the capabilities of the proposed security-related constructs in ArchiMate with regard to the necessities of enterprise security modeling. Our analysis relies on a well-founded reference ontology of security to uncover ambiguity, missing modeling elements, and other deficiencies of the security modeling capabilities in ArchiMate. Based on this ontologically-founded analysis, we propose a redesign of the security aspects of ArchiMate to overcome its original limitations.
This paper provides a method for characterizing space events using the framework of conceptual spaces. We focus specifically on estimating and ranking the likelihood of collisions between space objects. The objective is to design an approach for anticipatory decision support for space operators who can take preventive actions on the basis of assessments of relative risk. To make this possible, our approach draws on the fusion of both hard and soft data within a single decision support framework. Contextual data is also taken into account, for example data about space weather effects, by drawing on the Space Domain Ontologies, a large system of ontologies designed to support all aspects of space situational awareness. The framework is coupled with a mathematical programming scheme that yields a mathematically optimal approach to decision support, providing a quantitative basis for ranking the potential for collision across multiple satellite pairs. The goal is to provide the broadest possible information foundation for critical assessments of collision likelihood.
In Risk Management, security issues arise from complex relations among objects and agents, their capabilities and vulnerabilities, the events they are involved in, and the value and risk they entail for the stakeholders at hand. Further, there are patterns involving these relations that crosscut many domains, ranging from information security to public safety. Understanding and forming a shared conceptualization and vocabulary about these notions and their relations is fundamental for modeling the corresponding scenarios, so that proper security countermeasures can be devised. Ontologies are instruments developed to address these conceptual clarification and terminological systematization issues. Over the years, several ontologies have been proposed in Risk Management and Security Engineering. However, as shown in recent literature, they fall short in many respects, including generality and expressivity, the latter impacting their interoperability with related models. We propose a Reference Ontology for Security Engineering (ROSE) from a Risk Treatment perspective. Our proposal leverages two existing reference ontologies: the Common Ontology of Value and Risk and a Reference Ontology of Prevention, both of which are grounded in the Unified Foundational Ontology (UFO). ROSE is employed for modeling and analysing some cases, in particular providing clarification to the semantically overloaded notion of Security Mechanism.
The paper explores the influence of greenwash on green trust and discusses the mediation roles of green consumer confusion and green perceived risk. The research object of this study is Taiwanese consumers who have purchase experience of information and electronics products in Taiwan. This research employs an empirical study by means of structural equation modeling. The results show that greenwash is negatively related to green trust. Therefore, this study suggests that companies must reduce their greenwash behaviors to enhance their consumers' green trust. In addition, this study finds that green consumer confusion and green perceived risk mediate the negative relationship between greenwash and green trust. The results also demonstrate that greenwash is positively associated with green consumer confusion and green perceived risk, which would negatively affect green trust. This means that greenwash not only negatively affects green trust directly but also negatively influences it via green consumer confusion and green perceived risk indirectly. Hence, if companies would like to reduce the negative relationship between greenwash and green trust, they need to decrease their consumers' green consumer confusion and green perceived risk.
Mental health risks pose a high threat to individuals, especially overseas demographics, including expatriates, in comparison to the general Arab demographic. Since Arab countries are renowned for their multicultural environment, with half of the population of students and faculty being international, this paper focuses on a comprehensive analysis of mental health problems such as depression, stress, anxiety, isolation, and other unfortunate conditions. The dataset is developed from a web-based survey. A detailed exploratory data analysis is conducted on the dataset collected from Arab countries to study an individual's mental health and indicative help-seeking pointers based on their responses to specific pre-defined questions in a multicultural society. The proposed model validates the claims mathematically and uses different machine learning classifiers to identify individuals who are either currently or previously diagnosed with depression or demonstrate unintentional "save our souls" (SOS) behaviors, for an early prediction to prevent risks of danger in life going forward. The accuracy is measured by comparing the classifiers using several visualization tools. This analysis provides the claims and authentic sources for further research in the multicultural public medical sector and decision-making rules by the government.
Lockdowns, or modern quarantines, involve the use of novel restrictive non-pharmaceutical interventions (NPIs) to suppress the transmission of COVID-19. In this paper, I aim to critically analyze the emerging history and philosophy of lockdowns, with an emphasis on the communication of health evidence and risk for informing policy decisions. I draw a distinction between evidence-based and modeling-based decision-making. I argue that using the normative framework of evidence-based medicine would have recommended against the use of lockdowns. I first review the World Health Organization's evidence-based pandemic preparedness plans for respiratory viruses. I then provide a very brief history of COVID-19 modeling, which was cited as justification for the use of lockdowns in the U.K., the U.S., and much of the world. I focus on the so-called Imperial College model designed by Neil Ferguson et al. as well as the so-called Oxford model designed by José Lourenço et al. I analyze the evidence-based pandemic response known as 'mitigation', and I compare it with Ferguson et al.'s experimental strategy known as 'suppression'. I summarize the strengths and weaknesses of these strategies based on their diametric aims and each model's parametric assumptions. Based on my critical analysis of the suppression strategy, I attempt to expose what has been called the 'logic of lockdowns', which Sunetra Gupta of the Oxford model group has suggested is flawed. Finally, I consider Trisha Greenhalgh's objection to evidence-based policy based on the precautionary principle, and I attempt to offer a response. I conclude with a brief narrative review of the emerging randomized evidence on restrictive NPIs, which seems to support my claim that mitigation was the strategy that would have been recommended by evidence-based medicine. If this is true, then COVID-19 modeling may serve as an important reminder of the enduring lesson of evidence-based medicine: that one should always 'Trust the Evidence!' for better health policy.
This article discusses the management of cyclical dynamics of the credit sphere. Considering the fact that conventional methods of monetary policy have proved to be vulnerable to the process of accumulation and realization of risk in the credit sector, we aim to study heterodox methods of credit market regulation: administrative (direct) methods of control over fluctuations in the credit market. In particular, based on an analysis of the credit limits used in the Socialist Republic of Vietnam, we come to a conclusion about the comparative effectiveness of this method in curbing excessive amplitude of the credit cycle. However, the price of using this method may be connected with a reduction in the rate of economic growth and a shift in the accumulation of risk to other areas of the credit market.
This paper presents an analytical model for valuing interest rate swaps, subject to bilateral counterparty credit risk. The counterparty defaults are modeled by the reduced-form model as the first jump of a time-inhomogeneous Poisson process. All quantities modeled are market-observable. The closed-form solution gives us a better understanding of the impact of the credit asymmetry on swap value, credit value adjustment, swap rate and swap spread.
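For reference, in the standard reduced-form setup the abstract describes, default is the first jump of a time-inhomogeneous Poisson process with intensity λ(s), which gives the survival and interval default probabilities below (standard textbook formulas, with notation assumed here):

```latex
\[
  \Pr(\tau > t) \;=\; \exp\!\Big(-\!\int_0^t \lambda(s)\,\mathrm{d}s\Big),
  \qquad
  \Pr(t_1 < \tau \le t_2)
  \;=\; \exp\!\Big(-\!\int_0^{t_1} \lambda(s)\,\mathrm{d}s\Big)
  \;-\; \exp\!\Big(-\!\int_0^{t_2} \lambda(s)\,\mathrm{d}s\Big).
\]
```

Intensities calibrated to market-observable quantities such as CDS spreads can then feed directly into the swap valuation.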
The article examines the process of inventory formation at the enterprise and determines the optimal volume of commodity resources for sale. A generalization of authors' approaches to the formation and evaluation of enterprise inventories is carried out. A marketing-logistics approach was applied for the purpose of distributing groups of commodity resources according to the risk of non-fulfillment of an order for the supply of the enterprise's goods. In order to ensure an effective process of commodity provision of the enterprise, the costs associated with the formation of inventories are determined. A formalized scheme of the formation of commodity provision and the process of inventory optimization at the enterprise is offered. The structure of the company's inventory is analyzed, the volume of goods turnover is defined, and the stocks are grouped by various clustering characteristics. To conduct the study, statistical information on the commodity resources of the enterprises was used, along with statistical methods (grouping, structure analysis, estimation of dynamic series), tools for assessing the efficiency of inventory use, and HML-FMR clustering. The necessity of using XYZ and ABC analysis is indicated in order to obtain more reliable results and forecast values of the product support of the enterprise. Economic-mathematical modeling is applied, and the difference in the formation of commodity resources by various features of HML-FMR clustering is shown graphically. The calculations allow the enterprise to determine the optimal amount of commodity resources in accordance with the needs of consumers and their solvent demand, to plan financial resources for the formation of inventories, and to develop an assortment policy in accordance with demand for products and their sale. The results of calculating the enterprise's merchandising volume, taking into account HML-FMR clustering, affect the formation of the enterprise's final financial performance: income and profits.
This paper presents a new model for pricing OTC derivatives subject to collateralization. It allows for collateral posting adhering to bankruptcy laws. As such, the model can back out the market price of a collateralized contract. This framework is very useful for valuing outstanding derivatives. Using a unique dataset, we find empirical evidence that credit risk alone is not overly important in determining credit-related spreads. Only accounting for both collateral arrangement and credit risk can sufficiently explain unsecured credit costs. This finding suggests that failure to properly account for collateralization may result in significant mispricing of derivatives. We also empirically gauge the impact of collateral agreements on risk measurements. Our findings indicate that there are important interactions between market and credit risk.
Economic models describe individuals in terms of underlying characteristics, such as taste for some good, sympathy level for another player, time discount rate, risk attitude, and so on. In real life, such characteristics change through experiences: taste for Mozart changes through listening to it, sympathy for another player through observing his moves, and so on. Models typically ignore change, not just for simplicity but also because it is unclear how to incorporate change. I introduce a general axiomatic framework for defining, analysing and comparing rival models of change. I show that seemingly basic postulates on modelling change together have strong implications, like irrelevance of the order in which someone has his experiences and 'linearity' of change. This is a step towards placing the modelling of change on solid axiomatic grounds and enabling non-arbitrary incorporation of change into economic models.
In this chapter, one considers finance at its very foundations, namely, at the place where assumptions are being made about the ways to measure the two key ingredients of finance: risk and return. It is well known that returns for a large class of assets display a number of stylized facts that cannot be squared with the traditional views of 1960s financial economics (normality and continuity assumptions, i.e. the Brownian representation of market dynamics). Despite the empirical counterevidence, normality and continuity assumptions were part and parcel of financial theory and practice, embedded in all financial practices and beliefs. Our aim is to build on this puzzle to extract some clues revealing the use of one research strategy in the academic community: model tinkering, defined as a particular research habit. We choose to focus on one specific moment of the scientific controversies in academic finance: the 'leptokurtic crisis' opened by Mandelbrot in 1962. The profoundness of the crisis came from the angle of Mandelbrot's attack: not only did he emphasize an empirical inadequacy of the Brownian representation, but he also argued for an inadequate grounding of this representation. We give some insights into this crisis and display the model tinkering strategies of the financial academic community in the 1970s and the 1980s.
The authors of the book have come to the conclusion that it is necessary to effectively use modern approaches to the management of innovative development of economic entities in order to increase the efficiency of activity, ensure competitiveness, and intensify innovation activity. Basic research focuses on assessing the competition of economic entities, internal control in organizations, analysis of credit risk, diagnostics of sources of funding for innovation, and assessment of social innovation and human development factors. The research results have been implemented in different models of business process reengineering, the development of alternative agriculture, the digital economy, and knowledge management. The results of the study can be used in decision-making at the level of economic entities in different areas of activity and organizational-legal forms of ownership, and by ministries and departments that promote the development of economic entities on an innovative basis. The results can also be used by students and young scientists studying modern concepts and mechanisms for the management of innovative development of economic entities in the context of efficient use of resource potential and improvement of innovation policy.
This paper attempts to assess the economic significance and implications of collateralization in different financial markets, which is essentially a matter of theoretical justification and empirical verification. We present a comprehensive theoretical framework that allows for collateralization adhering to bankruptcy laws. As such, the model can back out differences in asset prices due to collateralized counterparty risk. This framework is very useful for pricing outstanding defaultable financial contracts. By using a unique data set, we are able to achieve a clean decomposition of prices into their credit risk factors. We find empirical evidence that counterparty risk is not overly important in credit-related spreads. Only the joint effects of collateralization and credit risk can sufficiently explain unsecured credit costs. This finding suggests that failure to properly account for collateralization may result in significant mispricing of financial contracts. We also analyze the difference between cleared and OTC markets.
The article provides a critical assessment of The Central Bank of the Russian Federation policy in response to the sanctions of the US, the EU, the UK, Switzerland, Japan, South Korea and a number of other countries. The effect of sanctions on the Russian economy and its financial market is viewed through the prism of credit, interest rate, and currency risk, and the risk of a decline in business activity. Special attention is paid to the inflationary component and inflationary expectations of the Russian Federation, as well as to the forecasts for a decline in business activity in Russia. A critical assessment is given to the actions of the Central Bank of the Russian Federation and the economic bloc of the government of the Russian Federation as a whole in response to the sanctions of the civilized world, which disable the normal existence of the economy and the main purpose of which is not to destroy the economy of the Russian Federation but to ensure the end of hostilities on the European continent. The results of our study will be useful to everyone who studies the problems of the effect of economic sanctions on the resource-based economy and the processes of stimulating political decisions by economic methods.
Climate engineering with stratospheric sulfate aerosol injections (SSAI) has the potential to reduce risks of injustice related to anthropogenic emissions of greenhouse gases. Relying on evidence from modeling studies, this paper makes the case that SSAI could have the potential to reduce many of the key physical risks of climate change identified by the Intergovernmental Panel on Climate Change. Such risks carry potential injustice because they are often imposed on low-emitters who do not benefit from climate change. Because SSAI has the potential to reduce those risks, it thereby has the potential to reduce the injustice associated with anthropogenic emissions. While acknowledging important caveats, including uncertainty in modeling studies and the potential for SSAI to carry its own risks of injustice, the paper argues that there is a strong case for continued research into SSAI, especially if attention is paid to how it might be used to reduce emissions-driven injustice.
Clearing algorithms are at the core of modern payment systems, facilitating the settling of multilateral credit messages with (near) minimum transfers of currency. Traditional clearing procedures use batch processing based on MILP (mixed-integer linear programming) algorithms. The MILP approach demands intensive computational resources; moreover, it is also vulnerable to operational risks generated by possible defaults during the inter-batch period. This paper presents TORC3, the Token-Ring Clearing Algorithm for Currency Circulation. In contrast to the MILP approach, TORC3 is a real-time heuristic procedure, demanding modest computational resources, and able to completely shield the clearing operation against the participating agents' risk of default.
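To see the economics of clearing that the abstract takes as background, here is a toy multilateral netting computation (not the TORC3 algorithm itself, whose token-ring mechanics the abstract only names): netting collapses gross obligations into much smaller currency transfers.

```python
import numpy as np

# obligations[i, j]: what agent i owes agent j (illustrative numbers)
obligations = np.array([
    [0.0, 50.0, 20.0],
    [30.0, 0.0, 40.0],
    [10.0, 25.0, 0.0],
])

gross = obligations.sum()                                # transfers without clearing
net = obligations.sum(axis=1) - obligations.sum(axis=0)  # pay minus receive

print(f"gross transfers: {gross}")                       # 175.0
print(f"net positions (positive = net payer): {net}")    # [ 30.  -5. -25.]
print(f"currency needed after netting: {net[net > 0].sum()}")  # 30.0
```

The computational burden in practice comes from constraints this toy ignores (timing, liquidity limits, partial settlement), which is where MILP batch solvers and real-time heuristics diverge.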
I argue that epistemic failings are a significant and underappreciated moral hazard in the financial services industry. I argue further that an analysis of these epistemic failings and their means of redress is best developed by identifying policies and procedures that are likely to facilitate good judgment. These policies and procedures are "best epistemic practices." I explain how best epistemic practices support good reasoning, thereby facilitating accurate judgments about risk and reward. Failures to promote and adhere to best epistemic practices contributed to the 2008 financial crisis. I identify and discuss some of the ways in which best epistemic practices were violated in the events that led to the crisis, with a focus on the role of the credit rating agencies. I go on to discuss some of the ways in which these failings have been redressed. I conclude by observing how proactive regulation for best epistemic practices might help us to anticipate and avoid future crises.
Nowadays, many new financial instruments for SME innovation projects are becoming popular in the business environment. A great number of them are traditional, like credit loans; at the same time, we can observe the appearance of innovative ones. The varied set of financial instruments is generalized in Fig. 1. This classification is given by the Organisation for Economic Co-operation and Development (OECD). As we can see from Fig. 1, the classification is based on a risk approach. For further research, it is necessary to clarify these definitions. Firstly, asset-based finance is a method of assigning structured turnaround capital and term loans, disbursement of a debit portfolio, stocks, machines, funds, and/or real estate. This type of funding is suitable for SME beginners, refinancing existing loans, growth financing, and mergers and acquisitions.
Philosophers of science and metascientists alike typically model scientists’ behavior as driven by credit maximization. In this article I argue that this modeling assumption cannot account for how scientists have a default level of trust in each other’s assertions. The normative implication of this is that science policy should not focus solely on incentive reform.
Libertarian self-ownership views in the tradition of Locke, Nozick, and the left-libertarians have supposed that we enjoy very powerful deontological protections against infringements upon our property. Such a conception makes sense when we are focused on property that is very important to its owner, such as a person's kidney. However, this stringency of our property rights is harder to credit when we consider more trivial infringements such as very mildly toxic pollution, or trivial risks such as having planes fly overhead. Maintaining that our rights against all infringements are very powerful threatens to implausibly make such pollution and trivial risk broadly impermissible. This paper suggests that self-ownership views have tended to inappropriately conflate the seriousness of different types of infringements, and that treating all infringements so seriously is implausible because it would make too much impermissible. I consider several ways to avoid this result within a self-ownership framework and conclude that the best approach is to allow that the strength of the protection against infringements should be tied to the seriousness of the harm of the infringement.
Many epidemics consist in individuals spreading infection to others. From the population perspective, they also have population characteristics important in modeling, explaining, and intervening in epidemics. I analyze epidemiology's contemporary population perspective through the example of epidemics by examining two central principles attributed to Geoffrey Rose: a distinction between the causes of cases and the causes of incidence, and between "high-risk" and "population" strategies of prevention. Both principles require revision or clarification to capture the sense in which they describe distinct perspectives on the same phenomenon, each perspective capturing a different level of contrastive analysis.
In this article, we describe a project in which philosophy, in combination with methods drawn from mental modeling, was used to structure dialogue among stakeholders in a region-scale climate adaptation process. The case study we discuss synthesizes the Toolbox dialogue method, a philosophically grounded approach to enhancing communication and collaboration in complex research and practice, with a mental modeling approach rooted in risk analysis, assessment, and communication to structure conversations among non-academic stakeholders who have a common interest in planning for a sustainable future. We begin by describing the background of this project, including details about climate resiliency efforts in West Michigan and the Toolbox dialogue method, which was extended in this project from academic research into community organization involving the West Michigan Climate Resiliency Framework Initiative. This extension involved application of several methods, which are the focus of the Methods section. We then present and discuss preliminary results that suggest the potential for philosophical dialogue to enhance mutual understanding in complex community initiatives that focus on sustainable responses to climate change. Overall, the article supplies a detailed, instructive example of how philosophy can support policy-relevant decision-making processes at the community level.
To find the neural substrates of consciousness, researchers compare subjects' neural activity when they are aware of stimuli against neural activity when they are not aware. Ideally, to guarantee that the neural substrates of consciousness—and nothing but the neural substrates of consciousness—are isolated, the only difference between these two contrast conditions should be conscious awareness. Nevertheless, in practice, it is quite challenging to eliminate confounds and irrelevant differences between conscious and unconscious conditions. In particular, there is an often-neglected confound that is crucial to eliminate from neuroimaging studies: task performance. Unless subjects' task performance is matched (and hence perceptual signal processing is matched), researchers risk finding the neural correlates of perception, rather than conscious perception. Here, we discuss the theoretical motivations for the performance matching framework and review empirical demonstrations of, and theoretical inferences derived from, obtaining differences in consciousness while controlling for task performance. We summarize signal detection theoretic modeling frameworks that explain how it is that we can derive performance-matched differences in consciousness without the effect being trivially driven by differences in criterion setting, and also provide principles for designing experimental paradigms that yield performance-matched differences in awareness. Finally, we address potential technical and theoretical issues that stem from matching performance across conditions of awareness, and we introduce the notion of "triangulation" for designing comprehensive experimental sets that can better reveal the neural substrates of consciousness.
Termination in international contracts is considered a harsh sanction that harms international trade for each breach of contract or its provisions. The interest of international trade is fulfilled in maintaining and completing performance of the contract, even if with a breach rectifiable by remedy. Termination destroys the contract and results in returning goods after their dispatch, in addition to the accompanying new freight and insurance expenses, the administrative and health procedures necessary for the entry and exit of goods, and the need to pay and then refund the price. Moreover, the goods are exposed again to damage and perishing risks. Furthermore, the international sale contract is inherently associated with other international contracts, such as the goods transport contract, the insurance contract and the documentary credit through which the price is paid. If the sale contract is terminated, its effect will apply to all other associated contracts, if not yet performed, which produces many issues and hardships.
The use of evolutionary game theory to explain the evolution of human norms and the behavior of humans who act according to those norms is widespread. Both the aims and motivation for its use are clearly articulated by Harms and Skyrms (2008) in the following passage: "A good theory of evolution of norms might start by explaining the evolution of altruism in Prisoner's Dilemma, of Stag Hunting, and of the equal split in the symmetric bargaining game. These are not well-explained by classical game theory based on rational choice. From a technical point of view, they present different theoretical challenges. In the bargaining game, there are an infinite number of equilibria with no principled (rational choice) way to select the cooperative one. In Stag Hunt there are only two, but the non-cooperative one is selected by risk-dominance. In Prisoner's Dilemma the state of mutual cooperation is not a Nash equilibrium at all, and cooperation flies in the face of the rational-choice principle that one does not choose less rather than more. In contrast to rational choice theory, the most common tool of evolutionary game theory is the replicator dynamics, in which the propagation rate of each strategy is determined by its current payoffs. These dynamics have a rationale in both biological and cultural evolutionary modeling, and sometimes tell us things that rational choice theory does not." We agree with the first sentence in this quotation: a good theory about the behavior under norms ought to explain altruism in the Prisoner's Dilemma (PD), playing Stag in Stag Hunt (SH), and offering equal splits in the symmetric Nash bargaining game (NB). We also agree with Harms and Skyrms about the difference in technical challenges each of these games poses. Finding a single mechanism, even one as broadly understood as evolution, that could solve these challenges en masse is no doubt a tall order. Nonetheless, in this paper, we present a single, simple, modification to SH, NB, and a general n-player PD that does just that: we introduce deontological autonomy into the models.
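The replicator dynamics mentioned in the quoted passage can be stated compactly: each strategy's share grows in proportion to its payoff advantage over the population average. Below is a minimal sketch for a two-strategy Stag Hunt with illustrative payoffs (not the paper's n-player models).

```python
import numpy as np

# Stag Hunt payoffs: Stag earns 4 against Stag, 0 against Hare; Hare earns 3.
payoff = np.array([[4.0, 0.0],
                   [3.0, 3.0]])

x = np.array([0.7, 0.3])           # initial shares of (Stag, Hare)
dt = 0.01
for _ in range(5_000):
    fitness = payoff @ x           # expected payoff of each strategy
    avg = x @ fitness              # population-average payoff
    x += dt * x * (fitness - avg)  # replicator update
    x /= x.sum()                   # guard against numerical drift

print(f"long-run shares (Stag, Hare): {x.round(3)}")
```

With these payoffs Stag survives only if its initial share exceeds 3/4, so a population starting at 0.7 converges to all-Hare, illustrating the risk-dominance selection the passage describes.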
Mental accounting is a concept associated with the work of Richard Thaler. According to Thaler, people think of value in relative rather than absolute terms. They derive pleasure not just from an object's value, but also the quality of the deal – its transaction utility (Thaler, 1985). In addition, humans often fail to fully consider opportunity costs (tradeoffs) and are susceptible to the sunk cost fallacy. Why are people willing to spend more when they pay with a credit card than cash (Prelec & Simester, 2001)? Why would more individuals spend $10 on a theater ticket if they had just lost a $10 bill than if they had to replace a lost ticket worth $10 (Kahneman & Tversky, 1984)? Why are people more likely to spend a small inheritance and invest a large one (Thaler, 1985)? According to the theory of mental accounting, people treat money differently, depending on factors such as the money's origin and intended use, rather than thinking of it in terms of the "bottom line" as in formal accounting (Thaler, 1999). An important term underlying the theory is fungibility, the fact that all money is interchangeable and has no labels. In mental accounting, people treat assets as less fungible than they really are. Even seasoned investors are susceptible to this bias when they view recent gains as disposable "house money" (Thaler & Johnson, 1990) that can be used in high-risk investments. In doing so, they make decisions on each mental account separately, losing sight of the big picture of the portfolio. (See also partitioning and pain of paying for ideas related to mental accounting.)
Over the past two decades, gamblers have begun taking mathematics into account more seriously than ever before. While probability theory is the only rigorous theory modeling uncertainty, even if under idealized conditions, numerical probabilities are viewed not only as mere mathematical information, but also as a decision-making criterion, especially in gambling. This book presents the mathematics underlying the major games of chance and provides a precise account of the odds associated with all gaming events. It begins by explaining in simple terms the meaning of the concept of probability for the layman and goes on to become an enlightening journey through the mathematics of chance, randomness and risk. It then continues with the basics of discrete probability, combinatorics and counting arguments for those interested in the supporting mathematics. These mathematical sections may be skipped by readers who do not have a minimal background in mathematics; these readers can skip directly to the Guide to Numerical Results to pick the odds and recommendations they need for the desired gaming situation. Doing so is possible due to the organization of that chapter, in which the results are listed at the end of each section, mostly in the form of tables. The chapter titled The Mathematics of Games of Chance presents these games not only as a good application field for probability theory, but also in terms of human actions where probability-based strategies can be tried to achieve favorable results. Through suggestive examples, the reader can see what the experiments, events and probability fields in games of chance are and how probability calculus works there. The main portion of this work is a collection of probability results for each type of game. Each game's section is packed with formulas and tables. Each section also contains a description of the game, a classification of the gaming events and the applicable probability calculations. The primary goal of this work is to allow the reader to quickly find the odds for a specific gaming situation, in order to improve his or her betting/gaming decisions. Every type of gaming event is tabulated in a logical, consistent and comprehensive manner. The complete methodology and complete or partial calculations are shown to teach players how to calculate probability for any situation, for every stage of the game, for any game. Here, readers can find the real odds, returned by precise mathematical formulas and not by the partial simulations that most software uses. Collections of odds are presented, as well as strategic recommendations based on those odds, where necessary, for each type of gaming situation. The book contains much new and original material that has not been published previously and provides great coverage of probabilities for the following games of chance: Dice, Slots, Roulette, Baccarat, Blackjack, Texas Hold 'em Poker, Lottery and Sports Bets. Most games of chance are predisposed to probability-based decisions. This is why the approach is not an exclusively statistical one, but analytical: every gaming event is taken as an individual applied probability problem to solve. A special chapter defines the probability-based strategy and mathematically shows why such a strategy is theoretically optimal.
This thesis consists of three papers examining determinants and implications of related party transactions (RPTs) in Vietnam, a transitional economy in South East Asia with features of concentrated state ownership and weak minority investor protection. Specifically, these papers describe RPTs and examine (i) the association between RPTs and state ownership, (ii) the association between the cost of corporate debt and RPTs, and the moderating role of state ownership on the association between the cost of debt and RPTs, and (iii) the association between corporate tax avoidance and RPTs, and the moderating role of state ownership on this potential association. The first paper describes the nature and extent of RPTs in Vietnamese listed firms and examines the association between RPTs and state ownership. The results from this paper demonstrate that related party transactions are prevalent in Vietnam. Findings show that the presence of state ownership is related to a lower extent of RPTs. However, among firms with state ownership, the extent of RPTs is positively associated with percentage of state ownership. The second paper reveals that the cost of debt is higher in firms having a higher level of RPTs, implying that RPTs are viewed as a potential risk to firms from the point of view of lenders. However, the presence of state ownership can reduce the effect of RPTs on the cost of debt. The third paper provides evidence that firms with RPTs demonstrate more tax avoidance than their counterparts without RPTs. Further, among firms with RPTs, firms with a higher extent of related net credit and related sales are found to exhibit even higher levels of tax avoidance. However, the association between tax avoidance and RPTs is moderated by the presence of state ownership. Finally, in firms with RPTs, the presence of state ownership reduces tax avoidance measured by effective tax rates.
This study contributes to the micro-credit literature by addressing the lack of philosophical dialogue concerning the issue of trust between micro-credit NGOs and rural poor women. The study demonstrates that one of the root causes of NGOs' contested roles in Bangladesh is the norm that they use (i.e., trust) to rationalize their micro-credit activities. I argue that Bangladeshi micro-credit NGOs' trust in poor village women is not genuine because they resort to group responsibility sustained through aggressive surveillance. I maintain so by drawing on a trust-based theoretical framework that uses various philosophical insights. Drawing on the same conceptual framework, I also contend, somewhat softening the previous claim, that if micro-credit trust is trust at all, it is at most strategic, not generalized. For being strategic, it has many undermining effects on local social solidarity norms, rendering Bangladeshi micro-credit NGOs and strategic trust an odd couple with no moral compass. To bring forth the moral impetus in micro-credit activities, I lay out some recommendations intended for organizations, managers, and policymakers, consistent with normative corporate social responsibility initiatives. However, further studies can be initiated based on this paper, suggesting its importance for future research.
Experimental modeling in biology involves the use of living organisms (not necessarily so-called "model organisms") in order to model or simulate biological processes. I argue here that experimental modeling is a bona fide form of scientific modeling that plays an epistemic role that is distinct from that of ordinary biological experiments. What distinguishes them from ordinary experiments is that they use what I call "in vivo representations" where one kind of causal process is used to stand in for a physically different kind of process. I discuss the advantages of this approach in the context of evolutionary biology.
The fate of optimality modeling is typically linked to that of adaptationism: the two are thought to stand or fall together (Gould and Lewontin, Proc R Soc Lond 205:581–598, 1979; Orzack and Sober, Am Nat 143(3):361–380, 1994). I argue here that this is mistaken. The debate over adaptationism has tended to focus on one particular use of optimality models, which I refer to here as their strong use. The strong use of an optimality model involves the claim that selection is the only important influence on the evolutionary outcome in question and is thus linked to adaptationism. However, biologists seldom intend this strong use of optimality models. One common alternative, which I term the weak use, simply involves the claim that an optimality model accurately represents the role of selection in bringing about the outcome. This and other weaker uses of optimality models insulate the optimality approach from criticisms of adaptationism, and they account for the prominence of optimality modeling (broadly construed) in population biology. The centrality of these uses of optimality models ensures a continuing role for the optimality approach, regardless of the fate of adaptationism.
This paper applies Causal Modeling Semantics (CMS; e.g., Galles and Pearl 1998; Pearl 2000; Halpern 2000) to the evaluation of the probability of counterfactuals with disjunctive antecedents. Standard CMS is limited to evaluating (the probability of) counterfactuals whose antecedent is a conjunction of atomic formulas. We extend this framework to disjunctive antecedents, and more generally, to any Boolean combinations of atomic formulas. Our main idea is to assign a probability to a counterfactual (A ∨ B) > C at a causal model M by looking at the probability of C in those submodels that truthmake A ∨ B (Briggs 2012; Fine 2016, 2017). The probability p((A ∨ B) > C) is then calculated as the average of the probability of C in the truthmaking submodels, weighted by the inverse distance to the original model M. The latter is calculated on the basis of a proposal by Eva et al. (2019). Apart from solving a major problem in the research on counterfactuals, our paper shows how work in semantics, causal inference and formal epistemology can be fruitfully combined.
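Schematically, the evaluation rule described above can be written as follows (notation assumed here, not the paper's: TM(A ∨ B) is the set of submodels of M truthmaking A ∨ B, d the model distance, and p_{M'}(C) the probability of C in submodel M'):

```latex
\[
  p\big((A \lor B) > C\big)
  \;=\;
  \sum_{M' \in \mathrm{TM}(A \lor B)}
  \frac{d(M, M')^{-1}}{\displaystyle\sum_{M'' \in \mathrm{TM}(A \lor B)} d(M, M'')^{-1}}
  \; p_{M'}(C).
\]
```

The normalization by the summed inverse distances is what makes this a weighted average rather than a mere sum.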
In this paper, I show that Polleit and Mariano (2011) are right in concluding that Credit Default Swaps (CDS) are per se unobjectionable from Rothbard's libertarian perspective on property rights and contract theory, but that they fail to derive this conclusion properly. I therefore outline the proper explanation. In addition, though Polleit and Mariano are correct in pointing out that speculation with CDS can conceivably hurt the borrowers' interests, they fail to grasp that this can be the case only in some peculiar circumstances that I identify. In other words, they miss the bigger picture, the one outside special circumstances, in which CDS trading has the opposite effect. That is, CDS facilitate debt accumulation, including government debt accumulation. Finally, I point out how this can precipitate the collapse of fiat money regimes. An incidental goal of the analysis is to provide a better account than Polleit and Mariano of recent government interventions in and around CDS markets.
Many in philosophy understand truth in terms of precise semantic values, true propositions. Following Braun and Sider, I say that in this sense almost nothing we say is, literally, true. I take the stand that this account of truth nonetheless constitutes a vitally useful idealization in understanding many features of the structure of language. The Fregean problem discussed by Braun and Sider concerns issues about application of language to the world. In understanding these issues I propose an alternative modeling tool summarized in the idea that inaccuracy of statements can be accommodated by their imprecision. This yields a pragmatist account of truth, but one not subject to the usual counterexamples. The account can also be viewed as an elaborated error theory. The paper addresses some prima facie objections and concludes with implications for how we address certain problems in philosophy.
This article argues that Lara Buchak's risk-weighted expected utility theory fails to offer a true alternative to expected utility theory. Under commonly held assumptions about dynamic choice and the framing of decision problems, rational agents are guided by their attitudes to temporally extended courses of action. If so, REU theory makes approximately the same recommendations as expected utility theory. Being more permissive about dynamic choice or framing, however, undermines the theory's claim to capturing a steady choice disposition in the face of risk. I argue that this poses a challenge to alternatives to expected utility theory more generally.
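For reference, Buchak's risk-weighted expected utility of a gamble is standardly stated as below, with outcomes ordered from worst to best and r the agent's risk function (this statement of the formula is supplied here for context, not taken from the article):

```latex
\[
  \mathrm{REU} \;=\; u(x_1) \;+\; \sum_{i=2}^{n}
  r\!\Big(\sum_{j=i}^{n} p_j\Big)\big[u(x_i) - u(x_{i-1})\big],
  \qquad x_1 \le \dots \le x_n .
\]
```

With r(p) = p this collapses to ordinary expected utility, which is why the theory's distinctiveness turns on the agent's attitude to whole gambles.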
A moderately risk averse person may turn down a 50/50 gamble that either results in her winning $200 or losing $100. Such behaviour seems rational if, for instance, the pain of losing $100 is felt more strongly than the joy of winning $200. The aim of this paper is to examine an influential argument that some have interpreted as showing that such moderate risk aversion is irrational. After presenting an axiomatic argument that I take to be the strongest case for the claim that moderate risk aversion is irrational, I show that it essentially depends on an assumption that those who think that risk aversion can be rational should be skeptical of. Hence, I conclude that risk aversion need not be irrational.
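A quick worked version of the opening example, with illustrative utilities (not from the paper): suppose u($0) = 0, u(+$200) = 2 and u(−$100) = −3, reflecting the greater felt pain of the loss. Then

```latex
\[
  \mathrm{EU}(\text{gamble})
  \;=\; \tfrac{1}{2}\,u(+\$200) + \tfrac{1}{2}\,u(-\$100)
  \;=\; \tfrac{1}{2}(2) + \tfrac{1}{2}(-3)
  \;=\; -\tfrac{1}{2}
  \;<\; 0 \;=\; u(\text{decline}),
\]
```

so declining the gamble maximizes expected utility, which is the sense in which such moderate risk aversion looks rational.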