[Rd] Problem with initializing R main loop under Windows (R.dll version: 2.15.1)

From: Peter Vorpahl <peter.vorpahl_at_uni-potsdam.de>
Date: Tue, 21 Aug 2012 01:10:25 +0200


Dear R-developers,
even though I may not be the first to encounter the problem described below, I would really appreciate some help, or a link to a forum where this topic is being discussed. Probably I am simply misinformed about the usage of structRstart, but see for yourselves; I have checked multiple forums and searched the web on this topic without success.
I am using the shared library version of R under Windows and my application is intended to provide a two-way interaction with R.

The code snippet I use to initialize and start the REPL is:

    void RServer::run()
    {
        // If we have no GUI frontend running, don't init the R server
        if (!connectToGui()) return;

        // Set the start time
        R_setStartTime();

        // Define the callback handler struct
        structRstart RParams;
        R_DefParams(&RParams);
        RParams.R_Quiet = FALSE;
        // RParams.R_Interactive = TRUE;
        RParams.R_Verbose = FALSE;
        RParams.R_Slave = FALSE;
        RParams.RestoreAction = SA_RESTORE;
        RParams.SaveAction = SA_NOSAVE;
        RParams.rhome = get_R_HOME();
        RParams.home = getRUser();
        RParams.CharacterMode = LinkDLL; // RGui;
        RParams.ShowMessage = RShowMessage;
        RParams.ReadConsole = RReadConsoleWin;
        RParams.WriteConsoleEx = RWriteConsoleEx;
        RParams.WriteConsole = 0;
        RParams.CallBack = RDoProcessEvents;
        RParams.YesNoCancel = RAskYesNoCancel;
        RParams.Busy = RBusy;

        // Install our own handlers
        R_SetParams(&RParams);

        // Flush the console input buffer
        // (WINAPI call: clears everything pending on stdin)
        FlushConsoleInputBuffer(GetStdHandle(STD_INPUT_HANDLE));

        // Fake a command line
        int argc = 2;
        char* argv[2] = { qstrdup("--no-save"), qstrdup("--no-restore") };

        // Redirect the BREAK signal to our own handler
        signal(SIGBREAK, pi_onintr);

        // Set the arguments in R
        R_set_command_line_arguments(argc, argv);

        // Here, the main initialization of R is performed
        setup_Rmainloop();

        // Init the IO buffer and the global context
        R_ReplDLLinit();

        // If I use the following instead, no problem occurs on an error
        // in evaluation:
        // run_Rmainloop();

        while ((_state == Connected || _state == Running) && R_ReplDLLdo1() > 0) {
            Pi::msleep(5); // to keep the processor cool
        }

        // Usually we don't get here on q()
        R_RunExitFinalizers();
        Rf_KillAllDevices();
    }
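The handlers assigned above (RShowMessage, RReadConsoleWin, RWriteConsoleEx, RDoProcessEvents, RAskYesNoCancel, RBusy) are not shown here; their prototypes are fixed by the callback fields of structRstart in R_ext/RStartup.h. Just to make the snippet self-contained, a rough sketch of such handlers, modelled on the rtest.c front-end example in the Windows sources, could look like this (illustrative only, not my actual implementations):

    #include <stdio.h>
    #include <R_ext/RStartup.h>   // structRstart and the callback prototypes

    // Prompt the user and fill 'buf'; return 1 to continue, 0 on EOF.
    static int myReadConsole(const char *prompt, char *buf, int len,
                             int addtohistory)
    {
        fputs(prompt, stdout);
        fflush(stdout);
        return fgets(buf, len, stdin) ? 1 : 0;
    }

    // 'otype' distinguishes regular output (0) from warnings/errors (1).
    static void myWriteConsoleEx(const char *buf, int len, int otype)
    {
        fwrite(buf, 1, len, otype ? stderr : stdout);
    }

    static void myShowMessage(const char *msg)
    {
        fprintf(stderr, "%s\n", msg);
    }

    // Called regularly so the frontend can process its event loop.
    static void myProcessEvents(void) { }

    // Return 1 for "yes", -1 for "no", 0 for "cancel".
    static int myYesNoCancel(const char *question) { return 1; }

    // which = 1: a computation started; which = 0: it finished.
    static void myBusy(int which) { }

As I understand it, WriteConsole has to be set to NULL when WriteConsoleEx is supplied, which is why the snippet above sets it to 0.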

This is mostly adapted from 'Writing R Extensions'. Nonetheless, this way I get an abnormal program termination whenever an error occurs in the evaluation of code returned from 'RReadConsoleWin'. I'm pretty sure there is no problem in my code, since everything works fine if I use 'run_Rmainloop()' instead; but then I have no control over the exit of the program, which implies that my program would not be able to do proper clean-up at the end.

I would really appreciate some help here.
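As an aside, and purely as an illustration of the documented embedding API (R_ParseVector() from R_ext/Parse.h, R_tryEval() from Rinternals.h) rather than something my application currently does: input can also be parsed and evaluated as sketched below, in which case an evaluation error is reported through a flag instead of unwinding past the caller.

    #include <Rinternals.h>
    #include <R_ext/Parse.h>

    // Parse and evaluate a string of R code, trapping evaluation errors.
    static void evalCode(const char *code)
    {
        ParseStatus status;
        SEXP cmd   = PROTECT(Rf_mkString(code));
        SEXP exprs = PROTECT(R_ParseVector(cmd, -1, &status, R_NilValue));

        if (status == PARSE_OK) {
            for (int i = 0; i < Rf_length(exprs); i++) {
                int errorOccurred = 0;
                // R_tryEval() returns NULL and sets the flag on error,
                // instead of unwinding past the caller.
                R_tryEval(VECTOR_ELT(exprs, i), R_GlobalEnv, &errorOccurred);
                if (errorOccurred) {
                    // report the error to the frontend; the loop keeps running
                }
            }
        }
        UNPROTECT(2);
    }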
Yours sincerely,
Peter

-- 
Institute of Earth and Environmental Sciences, University of Potsdam
Karl-Liebknecht-Str. 24/25 (House 12), 14476 Potsdam, Germany
office: +49 331 977 2469
mobile: +49 173 3732867
e-mail: peter.vorpahl_at_uni-potsdam.de

______________________________________________
R-devel_at_r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel