Want to take your Spotfire analysis to the next level? What about adding an extra level of interactivity to it?
Maybe you also have a BusinessWorks or BusinessEvents engine somewhere feeding data to it, and you spot something that requires your intervention, or maybe you want to replay a flow for some reason. Sounds like you need Spotfire to communicate with those engines.
HTTP? Sure, it works. But what if you prefer JMS instead, because you also have your own EMS server?
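On the messaging side, all you really need is a small JMS producer pointed at the EMS server. Below is a minimal sketch in Java; the server URL, credentials, queue name and payload are placeholders, and the TibjmsConnectionFactory class is assumed to come from the EMS client libraries (typically tibjms.jar):
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.jms.TextMessage;

public class EmsSender {
    public static void main(String[] args) throws Exception {
        //assumption: TibjmsConnectionFactory is provided by the EMS client jars
        //server URL, credentials, queue name and payload below are placeholders
        ConnectionFactory factory = new com.tibco.tibjms.TibjmsConnectionFactory("tcp://localhost:7222");
        Connection connection = factory.createConnection("admin", "");
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer producer = session.createProducer(session.createQueue("engine.replay.request"));
            TextMessage message = session.createTextMessage("<NODE>VALUE</NODE>");
            producer.send(message);
        } finally {
            connection.close();
        }
    }
}
A BusinessWorks or BusinessEvents process listening on that queue could then pick the message up and trigger the replay.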
03/08/2015
[APT] Fix "some indexes failed to download" error
If you ever encounter this APT error:
some indexes failed to download (E: Some index files failed to download. They have been ignored, or old ones used instead.)
It might mean that your local APT package lists somehow got messed up, but you can easily fix this by purging them and asking APT to download them again:
sudo rm -vf /var/lib/apt/lists/*
sudo apt-get update
[Oracle] Purge schema
Purging a schema in Oracle isn't a straightforward procedure. Usually it's better to DROP the schema or the USER and recreate it.
But if you do not have the permissions to do that, or have other restrictions preventing you from performing the operation, you might find this piece of SQL code useful:
SELECT 'drop '||object_type||' '||object_name||' '||DECODE(object_type,'TABLE', ' cascade constraints;', ';') FROM USER_OBJECTS
This will generate DROP statements for ALL objects in the schema it is run on. Just execute it after connecting as the user whose schema you want to purge, then copy the output and run it as a script.
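If you would rather automate the copy-and-run step, here is a sketch in plain Java/JDBC built on the same idea; the connection string and credentials are placeholders, the generated statements carry no trailing ';' because JDBC rejects it, and drops that fail (for example indexes already removed together with their table) are simply skipped:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

public class PurgeSchema {
    public static void main(String[] args) throws Exception {
        //placeholders: connect as the user whose schema you want to purge
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//localhost:1521/ORCL", "myuser", "mypassword")) {
            //collect the generated drop statements (no trailing ';', JDBC does not want it)
            List<String> drops = new ArrayList<String>();
            try (Statement stmt = con.createStatement();
                 ResultSet rs = stmt.executeQuery(
                     "SELECT 'drop '||object_type||' '||object_name"
                     + "||DECODE(object_type,'TABLE',' cascade constraints','') FROM USER_OBJECTS")) {
                while (rs.next()) {
                    drops.add(rs.getString(1));
                }
            }
            //run each drop, skipping the ones that fail
            for (String drop : drops) {
                try (Statement stmt = con.createStatement()) {
                    stmt.execute(drop);
                } catch (SQLException e) {
                    //object already gone or not droppable this way: skip it
                }
            }
        }
    }
}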
[Python] HTTP POST
Here is a sample piece of code showing how to issue an HTTP POST request with an XML payload from IronPython, using the .NET WebRequest class:
from System.Net import WebRequest
from System.Text import Encoding
from System.IO import StreamReader

# endpoint and payload (placeholders)
URI = 'https://httpbin.org/post'
PARAMETERS = "<NODE>VALUE</NODE>"

# build the POST request with an XML content type
request = WebRequest.Create(URI)
request.ContentType = "text/xml"
request.Method = "POST"

# write the payload to the request stream
bytes = Encoding.ASCII.GetBytes(PARAMETERS)
request.ContentLength = bytes.Length
reqStream = request.GetRequestStream()
reqStream.Write(bytes, 0, bytes.Length)
reqStream.Close()

# send the request and read the response body
response = request.GetResponse()
result = StreamReader(response.GetResponseStream()).ReadToEnd()
print result
[Java] untar
Now that we know how to tar in Java, let's see how to untar using the same Apache Commons Compress library:
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.OutputStream;
import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
import org.apache.commons.compress.archivers.tar.TarArchiveInputStream;
import org.apache.commons.compress.utils.IOUtils;
public void untar(String tarlocation, String tarname, String untarlocation) throws Exception {
    //method variables: tarlocation and tarname are concatenated as-is
    File tarFile = new File(tarlocation + tarname);
    FileInputStream fin = new FileInputStream(tarFile);
    TarArchiveInputStream tar_is = new TarArchiveInputStream(fin);
    TarArchiveEntry entry;
    File entryDestination;
    OutputStream out = null;
    try {
        //for each entry, untar to untarlocation, creating directories if needed
        while ((entry = tar_is.getNextTarEntry()) != null) {
            entryDestination = new File(untarlocation, entry.getName());
            //if necessary create the parent directory structure
            entryDestination.getParentFile().mkdirs();
            if (entry.isDirectory())
                entryDestination.mkdirs();
            else {
                try {
                    //untar the current entry
                    out = new FileOutputStream(entryDestination);
                    IOUtils.copy(tar_is, out);
                } finally {
                    //close the entry output stream ignoring exceptions
                    IOUtils.closeQuietly(out);
                }
            }
        }
    } finally {
        //close the archive streams ignoring exceptions
        IOUtils.closeQuietly(tar_is);
        IOUtils.closeQuietly(fin);
    }
}
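A quick usage sketch, assuming the method lives in some helper class (TarUtil is just a made-up name here); note that tarlocation and tarname are concatenated as-is, so include the trailing path separator yourself:
TarUtil util = new TarUtil();
//extract /backup/reports.tar into the /restore folder (paths are placeholders)
util.untar("/backup/", "reports.tar", "/restore");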
[Java] Tar file or folders
A simple way to tar a file or folder (with or without subdirectories) while maintaining the folder structure is to use the Apache Commons Compress library.
It has to be recursive so that we can handle subdirectories correctly. The resulting tar file will untar to the exact same folder structure that was originally tarred. If you pass the location and tarlocation parameters with the path separator already appended, there's no need to concatenate File.separator in the code.
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.OutputStream;
import org.apache.commons.compress.archivers.ArchiveOutputStream;
import org.apache.commons.compress.archivers.ArchiveStreamFactory;
import org.apache.commons.compress.archivers.tar.TarArchiveEntry;
import org.apache.commons.compress.utils.IOUtils;

//class fields: out writes the final file, tar_out wraps it in a tar archive
private OutputStream out;
private ArchiveOutputStream tar_out;
private FileInputStream tmp_fis;

public void tar(String location, String name, String tarlocation, String tarname) throws Exception {
    //out writes the final file, tar_out creates the tar archive
    out = new FileOutputStream(new File(tarlocation + File.separator + tarname + ".tar"));
    tar_out = new ArchiveStreamFactory().createArchiveOutputStream(ArchiveStreamFactory.TAR, out);
    //tar it: the first time baseDir is empty
    File f = new File(location + File.separator + name);
    dotar(f, "");
    //close archive
    tar_out.finish();
    out.close();
}

//aux method for tarring
private void dotar(File myFile, String baseDir) throws Exception {
    //maintain the directory structure while tarring
    String entryName = baseDir + myFile.getName();
    //do NOT call putArchiveEntry for folders, it is not needed
    //if it's a directory, list and tar its contents; recursion handles nested directories
    if (myFile.isDirectory()) {
        File[] filesList = myFile.listFiles();
        if (filesList != null) {
            for (File file : filesList) {
                dotar(file, entryName + File.separator);
            }
        }
    } else {
        //add the file as a new archive entry
        tmp_fis = new FileInputStream(myFile);
        try {
            tar_out.putArchiveEntry(new TarArchiveEntry(myFile, entryName));
            IOUtils.copy(tmp_fis, tar_out);
            tar_out.closeArchiveEntry();
        } finally {
            if (tmp_fis != null) tmp_fis.close();
        }
    }
}
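And a quick usage sketch, again assuming the methods sit in a made-up TarUtil helper class; here the code appends File.separator itself, so plain directory paths are enough:
TarUtil util = new TarUtil();
//tar the folder /data/reports into /backup/reports.tar (paths are placeholders)
util.tar("/data", "reports", "/backup", "reports");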