Custom I/O for HBase or Hive insert

Ideas, problems or suggestions regarding the WRF software.

Postby snhirsch » Tue May 17, 2016 5:22 pm

My team has asked me to evaluate the feasibility of writing an I/O module capable of storing model results directly into a large Hadoop cluster (targeting Hive and/or HBase). I've downloaded all the I/O stack documentation and have started reading through it. Before I spend too much time on this, can I ask a few quick questions?

- Has anyone written such a thing already?

- Can replacement I/O modules be written in C or C++, or is Fortran a requirement?

- I note that the default I/O module talks to the netCDF libraries. Are those written in Fortran as well?

I will continue studying the documentation, but any input or thoughts would be welcome.
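
To make the question more concrete, here is a rough sketch of the kind of shim I have in mind, written in C. Everything in it is invented for illustration (the function name, argument list, and the tab-separated layout); if the I/O layer has to be Fortran, I assume a thin interoperability interface (e.g. ISO_C_BINDING) would sit between it and code like this.

[code]
/* Hypothetical shim: flatten a 2D model field into tab-separated rows that a
 * Hive external table (or an HBase bulk loader) could ingest.  The function
 * name, arguments, and output layout are all made up for illustration only. */
#include <stdio.h>

/* Append one 2D field as rows of: timestamp \t varname \t i \t j \t value */
int wrfhive_write_field(const char *path,
                        const char *timestamp,
                        const char *varname,
                        const float *data,   /* row-major, nx * ny values */
                        int nx, int ny)
{
    FILE *fp = fopen(path, "a");
    if (!fp) return -1;

    for (int j = 0; j < ny; ++j) {
        for (int i = 0; i < nx; ++i) {
            fprintf(fp, "%s\t%s\t%d\t%d\t%g\n",
                    timestamp, varname, i, j, (double)data[j * nx + i]);
        }
    }

    fclose(fp);
    return 0;
}

/* Stand-alone smoke test so the sketch compiles and runs on its own. */
int main(void)
{
    float t2[4] = { 280.1f, 280.5f, 281.0f, 281.3f };  /* 2x2 dummy field */
    return wrfhive_write_field("t2.tsv", "2016-05-17_18:00:00", "T2", t2, 2, 2);
}
[/code]

The idea would be to land files like this in HDFS and define a Hive external table over them (or convert them to HFiles for an HBase bulk load), rather than having the model talk to the cluster directly on every write.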

Re: Custom I/O for HBase or Hive insert

Postby snhirsch » Thu Jun 02, 2016 9:39 am

Bump...

Is this forum active? Should I be asking this question elsewhere?

Re: Custom I/O for HBase or Hive insert

Postby snhirsch » Fri Jul 01, 2016 10:27 am

Certainly quiet here. I'll start looking elsewhere for an active venue.

