Wednesday, February 13, 2013

Stationary Time Series

#Assignment 1: Create the log of the return data (way 1: log of ((s_t - s_t-1)/s_t-1)).
> #Calculate historical volatility.
> #Create the ACF plot for the log(returns) data, run the ADF test, and interpret. NSE Nifty index (Jan 2012 to 31 Jan 2013).
Program:
> library(tseries)   # needed below for adf.test()
> z<-read.csv(file.choose(),header=T)
> closingprice<-z$Close
> closingprice.ts<-ts(closingprice,frequency=252)   # 252 trading days per year
> laggingtable<-cbind(closingprice.ts,lag(closingprice.ts,k=-1),closingprice.ts-lag(closingprice.ts,k=-1))
> Return<-(closingprice.ts-lag(closingprice.ts,k=-1))/lag(closingprice.ts,k=-1)
> Manipulate<-scale(Return)+10   # shift the standardised returns so all values are positive before taking logs
> logreturn<-log(Manipulate)
> acf(logreturn)
The ACF plot shows that all the autocorrelations (beyond lag 0) fall within the 95% confidence bounds, and hence we can say that the time series is stationary.
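The relationship between the simple return and the log return used above can be sketched in Python with hypothetical prices (not the actual NSE Nifty data): for a positive price series, log(s_t/s_t-1) = log(s_t) - log(s_t-1), and for small moves log(1+r) is approximately r.

```python
import math

# Hypothetical closing prices (illustrative only, not the actual Nifty series)
prices = [100.0, 101.5, 100.8, 102.3]

# Simple returns: (s_t - s_{t-1}) / s_{t-1}
simple = [(prices[i] - prices[i - 1]) / prices[i - 1] for i in range(1, len(prices))]

# Log returns: log(s_t / s_{t-1}); for small moves log(1 + r) ~ r
logret = [math.log(prices[i] / prices[i - 1]) for i in range(1, len(prices))]

for r, lr in zip(simple, logret):
    print(round(r, 6), round(lr, 6))
```

Because daily index moves are small, the two series stay close, which is why log returns are a convenient stand-in for simple returns in stationarity analysis.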
> ann.factor<-252^.5   # sqrt(252) annualises daily volatility; avoid calling this T, which masks the TRUE shorthand
> Historicalvolatility<-sd(Return)*ann.factor
> Historicalvolatility
[1] 0.1475815
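The annualisation step can be sketched in Python with hypothetical daily returns (not the actual Nifty series): take the sample standard deviation of the daily returns (n-1 denominator, matching R's sd()) and scale by the square root of 252 trading days.

```python
import math

# Hypothetical daily simple returns (illustrative only)
returns = [0.002, -0.001, 0.0035, -0.0025, 0.0015]

n = len(returns)
mean = sum(returns) / n

# Sample standard deviation with n-1 denominator, as in R's sd()
sd = math.sqrt(sum((r - mean) ** 2 for r in returns) / (n - 1))

# Annualise with sqrt(252) trading days, as in the R transcript
hist_vol = sd * math.sqrt(252)
print(hist_vol)
```

The sqrt(252) factor follows from the variance of a sum of independent daily returns growing linearly with the number of days.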
> adf.test(logreturn)

        Augmented Dickey-Fuller Test

data:  logreturn
Dickey-Fuller = -5.656, Lag order = 6, p-value = 0.01
alternative hypothesis: stationary

Warning message:
In adf.test(logreturn) : p-value smaller than printed p-value

Since the p-value (reported as 0.01, and actually smaller) is less than 0.05 (= 1 - 0.95), we reject the null hypothesis of a unit root; hence the time series is stationary and further data analysis can be done.






Thursday, February 7, 2013

Returns and Forecasting

Objective 1: Find the returns of NSE data covering more than 6 months, taking the 10th data point as the start and the 95th data point as the end. Also plot the results.

Solution:
Step 1: Read the data from a CSV file for the period 1/12/2011 to 5/02/2013.
Command:
 z<-read.csv(file.choose(),header=T)

Step 2: Choose the Close column.
Command:
 close<-z$Close

Step 3: Vectorise the data, i.e. form a matrix of order 1x298, since 298 data points are available in close.
Command:
dim(close)<-c(1,298)

Step 4: Create a time-series object for the close data from element (1,10) to (1,95).
Command:
close.ts<-ts(close[1,10:95],deltat=1/252)
Step 5: Calculate the difference between each value and the preceding one.
Command:
close.diff<-diff(close.ts)
Step 6: Calculate the return:
Command:
return<-close.diff/lag(close.ts,k=-1)
final<-cbind(close.ts,close.diff,return)
Step 7: Plot
Command:
plot(return,main="Return from 10th to 95th")
plot(final,main="Data from 10th to 95th, Difference, Return")
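Steps 5 and 6 can be sketched in Python with hypothetical close prices (not the actual NSE data): the first difference divided by the lagged (previous) close gives the period return.

```python
# Hypothetical close prices for the selected window (illustrative only)
close = [5200.0, 5235.5, 5210.2, 5260.9]

# Step 5 equivalent: first differences, diff[i] = close[i+1] - close[i]
close_diff = [close[i] - close[i - 1] for i in range(1, len(close))]

# Step 6 equivalent: return = difference / lagged close
ret = [d / close[i] for i, d in enumerate(close_diff)]
print(ret)
```

Note that differencing drops the first observation, so the return series is one element shorter than the price series, just as in R.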

Objective 2: Data for observations 1-700 is available. Predict observations 701-850 using GLM estimation with logit analysis.

Step 1: Read the data from a CSV file.

Command:
z<-read.csv(file.choose(),header=T)

Step 2: Check the dimension of z.
Command
dim(z)


Step 3: Choose rows 1-700 (all 9 columns).
Command

 new<-z[1:700,1:9]

Step 4: Inspect the first few rows.
Command
head(new)

Step 5: Convert ed (education level) to a factor and run the logit regression.
Command

 new$ed <- factor(new$ed)
 new.est<-glm(default ~ age + ed + employ + address + income, data=new, family ="binomial")
 summary(new.est)

Step 6: Predict for rows 701-850, using columns 1-8 (the predictors, without default).
Prediction<-z[701:850,1:8]
 Prediction$ed<-factor(Prediction$ed)
 Prediction$prob<-predict(new.est, newdata =Prediction, type = "response")
 head(Prediction)
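What predict(..., type="response") does can be sketched in Python: apply the inverse logit (sigmoid) to the linear predictor built from the fitted coefficients. The coefficients below are hypothetical placeholders; the real values would come from summary(new.est).

```python
import math

def sigmoid(x):
    # Inverse logit: maps the linear predictor to a probability in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical logit coefficients (illustrative, not the fitted new.est values)
intercept = -1.2
b_age, b_employ, b_income = 0.03, -0.25, 0.004

def predict_prob(age, employ, income):
    # Linear predictor eta = b0 + b1*age + b2*employ + b3*income
    eta = intercept + b_age * age + b_employ * employ + b_income * income
    return sigmoid(eta)

p = predict_prob(age=35, employ=8, income=45)
print(round(p, 4))
```

A probability near 0 or 1 gives a confident default/no-default classification; values near 0.5 are ambiguous, which is why inspecting head(Prediction) after adding the prob column is a useful sanity check.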