2017 18th International Conference on Parallel and Distributed Computing, Applications and Technologies (PDCAT), 2017
Background and foreground separation is the major task in video surveillance systems for detecting moving or suspicious objects. Robust Principal Component Analysis, whose formulation relies on a low-rank plus sparse matrix decomposition, provides a suitable framework for separating moving objects from the background. The optimization problem is transformed into a sequence of convex programs that minimize the sum of the L1-norm and the nuclear norm of the two component matrices, which are efficiently solved by an Augmented Lagrangian Multipliers based solver. In this paper, we propose two new robust schemes for low-rank approximation of numerical matrices. The proposed algorithms allow batch and incremental robust low-rank approximation of matrices, used in static and real-time foreground extraction to detect moving objects. Experiments reveal that the proposed methods are deterministic, converge quickly, and achieve accurate background and foreground separation.
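The low-rank plus sparse decomposition described above can be sketched with a generic inexact Augmented Lagrangian Multipliers (ALM) solver for Principal Component Pursuit. This is a minimal illustration of the general technique, not the paper's batch or incremental algorithms; the parameter choices (lambda = 1/sqrt(max(m, n)), the mu update rule) follow common conventions rather than the paper.

```python
import numpy as np

def rpca_ialm(M, lam=None, tol=1e-7, max_iter=100):
    """Decompose M into low-rank L plus sparse S by inexact ALM
    (Principal Component Pursuit). Generic sketch, not the paper's solver."""
    m, n = M.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))          # conventional weight
    norm_M = np.linalg.norm(M)
    # Standard dual-variable initialization for inexact ALM
    Y = M / max(np.linalg.norm(M, 2), np.abs(M).max() / lam)
    mu = 1.25 / np.linalg.norm(M, 2)
    rho = 1.5                                      # growth factor for mu
    S = np.zeros_like(M)
    for _ in range(max_iter):
        # Low-rank step: singular-value thresholding at level 1/mu
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = U @ np.diag(np.maximum(sig - 1.0 / mu, 0.0)) @ Vt
        # Sparse step: entrywise soft-thresholding at level lam/mu
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        # Dual update on the constraint residual M = L + S
        Z = M - L - S
        Y = Y + mu * Z
        mu = rho * mu
        if np.linalg.norm(Z) / norm_M < tol:
            break
    return L, S
```

On a synthetic low-rank matrix corrupted by sparse gross errors, this solver typically recovers both components almost exactly.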
Visceral leishmaniasis (VL) cases in children less than five years of age were recorded from 1996 through 2006 from Tunisian pediatric departments. Mean incidence rates were calculated for each of the 215 districts in the study area. Averages of annual rainfall and extreme values of low temperatures in winter and high temperatures in summer were used to characterize the climate of each district according to its continentality index and bioclimatic zone. A geographic information system and a local indicator of spatial association were used to summarize the spatial properties of VL distribution. Poisson spatial regression was performed to study the relationship between VL incidence rates and climatic parameters. We identified one hot-spot region of 35 inland districts located mostly in the semi-arid bioclimatic zone and two cold-spots located in coastal regions of the northeastern sub-humid zone and the southeastern arid zone. The incidence rate of VL was positively correl...
We present a prequential (predictive-sequential) approach for testing the goodness-of-fit of an exponential distribution when the parameter λ is unknown. Instead of using all the available observations, λ is estimated by a prequential approach where at each step i, only the first i−1 observations are used. We show that this approach provides a sequence of Kolmogorov–Smirnov type distances whose expressions do not depend on λ and which converge in distribution (under the null hypothesis) to the Kolmogorov–Smirnov distribution. This leads to a simple technique for testing the goodness-of-fit of exponential distributions with unknown parameter using standard quantile tables of the Kolmogorov–Smirnov distribution. Even if Monte Carlo simulations show that the prequential test is less powerful than the standard exponentiality test, the developed results represent a first step in the theoretical study of the u-plot, which is a prequential empirical tool commonly used for the validation of reliabi...
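The prequential idea (transform each observation with the parameter estimated from the preceding observations only, then compare the transformed sample with the uniform law via a Kolmogorov–Smirnov distance) can be sketched as follows. The exact statistic studied in the paper may differ; this is only an illustration of the general scheme.

```python
import numpy as np
from scipy import stats

def prequential_exp_ks(x):
    """Prequential goodness-of-fit check for exponentiality: each
    observation is transformed with the rate estimated from earlier
    observations only. A sketch of the idea, not the paper's statistic."""
    x = np.asarray(x, float)
    u = []
    for i in range(1, len(x)):
        lam_hat = i / x[:i].sum()                 # MLE from the first i observations
        u.append(1.0 - np.exp(-lam_hat * x[i]))   # probability integral transform
    u = np.array(u)
    # KS distance of the transformed sample from Uniform(0, 1)
    d, p = stats.kstest(u, "uniform")
    return d, p
```

Under the null (exponential data), the transformed values are approximately uniform, so the distance stays small.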
Agri-biotech multinational enterprises (MNEs) persist in pushing genetically modified plant varieties (GMV) worldwide, including in emerging countries, as a technological solution for sustainable development. However, in emerging countries, the structure and effectiveness of regulation and compliance measures to ensure human and environmental safety are much less developed. There are three types of concerns: the economic risks faced by farmers using existing low-yielding conventional seed varieties in the face of inadequate institutional mechanisms and safety nets, the long-term environmental risks, and finally, risks posed by other possible externalities. In an attempt to provide some insight into the aforementioned debate, this chapter focuses on a commercially successful GMV, namely genetically modified cotton, also referred to as Bt cotton. The literature on adoption of Bt cotton is first examined, and its findings are confronted with the reality of the introduction and diffu...
This paper models and predicts how the strengthening of intellectual property (IP) protection will impact R&D in developing economies. International agreements such as TRIPs and free trade agreements are enhancing the level of international control on IP. This is deeply changing the R&D environment in developing economies by restraining illegal channels of knowledge accumulation such as imitation, reverse engineering and piracy. An asymmetric and non-cooperative two-stage (R&D-Production) game is proposed to model a developing market where two local firms compete with a more innovative foreign firm. Equilibrium R&D expenditures and profits of the competing firms are compared for different levels of market technology, technological gaps and IP protection. The proposed model clearly shows that stringent enforcement of IP agreements will dramatically decrease the innovative abilities of developing economies, especially in high-technology sectors. Maintaining and increasing their...
Value-at-Risk (VaR) is one of the most widely used tools for assessing financial market risk. In practice, the estimation of market risk by VaR generally uses models assuming independence of returns. However, financial returns tend to occur in clusters with time dependency; therefore, in this paper we study the impact of neglecting return dependency in market risk assessment. The main methods which take return dependency into account to assess market risk are declustering, the extremal index, and the combination of time series models with Extreme Value Theory. A comparison between VaR estimated under the independence and the dependence assumptions shows an important reduction of the estimation error under the dependence assumption. Results for simulated data show that the declustering and extremal index methods generally have the best performance. Extreme financial risk affects the capital allocated to cover it; an erroneous hypothesis induces an error in the amplitude of the estimated risk. For real data Time s...
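The extremal index mentioned above measures the degree of clustering of extremes (θ = 1 for independent extremes, θ < 1 when exceedances arrive in clusters). A textbook blocks estimator, not necessarily the variant used in the paper, can be sketched as:

```python
import numpy as np

def extremal_index_blocks(x, threshold, block_size=50):
    """Blocks estimator of the extremal index: ratio of the number of
    blocks containing at least one exceedance to the total number of
    exceedances. Textbook sketch, not the paper's method."""
    x = np.asarray(x, float)
    n = (len(x) // block_size) * block_size       # drop the incomplete tail block
    blocks = x[:n].reshape(-1, block_size)
    k = (blocks.max(axis=1) > threshold).sum()    # blocks with an exceedance
    N = (x[:n] > threshold).sum()                 # total exceedances
    return k / N if N else 1.0
```

On a moving-maximum series (a classic clustered-extremes example with true θ = 1/2), the estimate is clearly below 1, reflecting the clustering that an independence assumption would ignore.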
This work is devoted to the study of statistical methods for assessing software reliability. Its main goal is to provide statistical tools for building and then validating models that take into account the specific characteristics of the software under study. Two tools are used for this purpose: generalized linear models (parametric and non-parametric) and Bayesian statistical analysis. The second part of this work is devoted to the mathematical study of validation and model-selection problems in software reliability. Among other topics, it studies a so-called "prequential" (predictive-sequential) approach well suited to goodness-of-fit tests for Poisson processes. This approach seems to generalize to a large number of software reliability models.
We present a prequential (predictive-sequential) approach for testing the goodness-of-fit of an exponential distribution when the parameter is unknown. Instead of using all the available observations, the parameter is estimated by a prequential approach where at each step i, only the first i−1 observations are used. We show that this approach provides a sequence of Kolmogorov–Smirnov type distances whose expressions do not depend on the parameter and which converge in distribution (under the null hypothesis) to the Kolmogorov–Smirnov distribution. This leads to a simple technique for testing the goodness-of-fit of exponential distributions with unknown parameter using standard quantile tables of the Kolmogorov–Smirnov distribution. Even if Monte Carlo simulations show that the prequential test is less powerful than the standard exponentiality test, the developed results represent a first step in the theoretical study of the u-plot, which is a prequential empirical tool commonly used for the validation of reliability-gr...
This work is mainly concerned with the use of statistical tools for the assessment of software reliability. It provides statistical techniques for the construction and validation of models that take into account the specific properties of each piece of software. For this, we mainly use Generalized Linear Models (parametric and non-parametric) and Bayesian methods. The final part studies the mathematical problems of validation and choice of Software Reliability models. The predictive-sequential approach is shown to give a simple way of testing the fit of Poisson process models. The obtained predictive-sequential test seems to be usable for many Software Reliability models.
Advances in Intelligent Systems and Computing, 2015
This work presents an artificial order-driven market populated by heterogeneous agents characterized by mixed behaviors and shared information. This market is designed to reproduce the stylized facts of immature markets, based mainly on strong information asymmetry and herd behavior. Information flows are modeled by a directed weighted network. The strong information asymmetry is modeled by different assortative network behaviors. Our experimental findings show that the artificial market developed here was able to reproduce the main features characterizing immature stock markets.
International Journal of Reliability, Quality and Safety Engineering, 2000
When extreme quantiles have to be estimated from a given data set, the classical parametric approach can lead to very poor estimates. This has led to the introduction of specific methods for estimating extreme quantiles (MEEQs) in a nonparametric spirit, e.g., Pickands' excess method, methods based on Hill's estimate of the Pareto index, and the exponential tail (ET) and quadratic tail (QT) methods. However, no practical technique for assessing and comparing these MEEQs when they are to be used on a given data set is available. This paper is a first attempt to provide such techniques. We first compare the estimates given by the main MEEQs on several simulated data sets. Then we suggest goodness-of-fit (GoF) tests to assess the MEEQs by measuring the quality of their underlying approximations. It is shown that GoF techniques bring very relevant tools to assess and compare the ET and excess methods. Other empirical criteria for comparing MEEQs are also proposed a...
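The exponential tail (ET) method mentioned above can be sketched as follows: fit an exponential law to the excesses over a high threshold and extrapolate to the desired quantile. The default choice of the number k of upper order statistics below is an arbitrary assumption for illustration, not taken from the paper.

```python
import numpy as np

def et_quantile(x, p, k=None):
    """Exponential-tail (ET) estimate of the quantile of order 1-p:
    fit an exponential law to the k largest excesses over a threshold
    and extrapolate. Standard sketch of the ET idea."""
    x = np.sort(np.asarray(x, float))
    n = len(x)
    k = k or max(int(0.1 * n), 2)      # number of upper order statistics (assumption)
    u = x[n - k - 1]                   # threshold = (k+1)-th largest value
    mean_excess = (x[n - k:] - u).mean()
    # Exponential tail approximation: P(X > u + y) ~ (k/n) * exp(-y / mean_excess)
    return u + mean_excess * np.log(k / (n * p))
```

For data that really are exponential, the memoryless property makes this extrapolation essentially exact, so the estimate lands close to the true quantile −log(p).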