Friday, November 22, 2013

Blood draws will someday be easier

Thanks to smart glasses.

Cosmic radiation and temperature

Cosmic radiation and temperature affect the error rate in RAM cells and in the processor cache. An obvious thing, now actually measured.

Configure Tibco EMS provider for JBoss EAP 5.x

Put jms.jar and tibjms.jar into lib/endorsed.

Add the following to server/default/deploy/messaging/jms-ds.xml:
<mbean code="org.jboss.jms.jndi.JMSProviderLoader"
       name="jboss.messaging:service=JMSProviderLoader,name=EMSProvider">
   <attribute name="ProviderName">DefaultEMSProvider</attribute>
   <attribute name="ProviderAdapterClass">org.jboss.jms.jndi.JNDIProviderAdapter</attribute>
   <attribute name="FactoryRef">ConnectionFactory</attribute>
   <attribute name="QueueFactoryRef">QueueConnectionFactory</attribute>
   <attribute name="TopicFactoryRef">TopicConnectionFactory</attribute>
   <attribute name="Properties">
      java.naming.security.principal=admin
      java.naming.security.credentials=Adm1n
      java.naming.factory.initial=com.tibco.tibjms.naming.TibjmsInitialContextFactory
      java.naming.factory.url.pkgs=com.tibco.tibjms.naming
      java.naming.provider.url=tibjmsnaming://tb-dev2.dc2:7222
   </attribute>
</mbean>
In tibemsadmin, execute:
create queue queue/DLQ
create queue testQueue
commit
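
Before wiring up the MDB, the EMS naming settings can be sanity-checked with a small standalone lookup. A minimal sketch, reusing the properties from the JMSProviderLoader above (it assumes tibjms.jar is on the classpath):

import java.util.Properties;
import javax.jms.QueueConnectionFactory;
import javax.naming.Context;
import javax.naming.InitialContext;

public class EmsJndiCheck {
    public static void main(String[] args) throws Exception {
        Properties env = new Properties();
        // Same values as in the "Properties" attribute of the mbean above.
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.tibco.tibjms.naming.TibjmsInitialContextFactory");
        env.put(Context.URL_PKG_PREFIXES, "com.tibco.tibjms.naming");
        env.put(Context.PROVIDER_URL, "tibjmsnaming://tb-dev2.dc2:7222");
        env.put(Context.SECURITY_PRINCIPAL, "admin");
        env.put(Context.SECURITY_CREDENTIALS, "Adm1n");

        Context ctx = new InitialContext(env);
        // The factory and the queue created in tibemsadmin should both resolve.
        QueueConnectionFactory qcf = (QueueConnectionFactory) ctx.lookup("QueueConnectionFactory");
        Object queue = ctx.lookup("testQueue");
        System.out.println("Looked up " + qcf + " and " + queue);
        ctx.close();
    }
}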

Write the MDB:

import javax.ejb.ActivationConfigProperty;
import javax.ejb.MessageDriven;
import javax.jms.Message;
import javax.jms.MessageListener;

@MessageDriven(
    activationConfig = {
        @ActivationConfigProperty(propertyName = "destination", propertyValue = "testQueue"),
        @ActivationConfigProperty(propertyName = "destinationType", propertyValue = "javax.jms.Queue"),
        @ActivationConfigProperty(propertyName = "providerAdapterJNDI", propertyValue = "java:/DefaultEMSProvider"),
        @ActivationConfigProperty(propertyName = "user", propertyValue = "user"),
        @ActivationConfigProperty(propertyName = "password", propertyValue = "pass"),
        @ActivationConfigProperty(propertyName = "DLQUser", propertyValue = "user"),
        @ActivationConfigProperty(propertyName = "DLQPassword", propertyValue = "pass")
    },
    mappedName = "testQueue")
public class TestQueue implements MessageListener {

    public TestQueue() {}

    // Just dump every received message to the server log.
    public void onMessage(Message message) {
        System.out.println(message);
    }
}

/* see org.jboss.resource.adapter.jms.inflow.JmsActivationSpec for all properties */

And you've got it:

10:39:37,064 INFO [STDOUT] TextMessage={ Header={ JMSMessageID={ID:EMS-TEST.D96528614F411790C:7} JMSDestination={Queue[testQueue]} JMSReplyTo={null} JMSDeliveryMode={PERSISTENT} JMSRedelivered={false} JMSCorrelationID={null} JMSType={null} JMSTimestamp={Fri Nov 22 10:39:36 CET 2013} JMSExpiration={0} JMSPriority={4} } Properties={ } Text={2013-11-22T10:39:37.028+01:00} }
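
For the record, a timestamp message like the one in the log can be produced with a plain EMS sender along these lines (a sketch; the JNDI properties repeat the mbean configuration and the credentials are an assumption):

import java.util.Properties;
import javax.jms.Queue;
import javax.jms.QueueConnection;
import javax.jms.QueueConnectionFactory;
import javax.jms.QueueSender;
import javax.jms.QueueSession;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.Context;
import javax.naming.InitialContext;

public class EmsTestSender {
    public static void main(String[] args) throws Exception {
        Properties env = new Properties();
        env.put(Context.INITIAL_CONTEXT_FACTORY, "com.tibco.tibjms.naming.TibjmsInitialContextFactory");
        env.put(Context.PROVIDER_URL, "tibjmsnaming://tb-dev2.dc2:7222");
        env.put(Context.SECURITY_PRINCIPAL, "admin");
        env.put(Context.SECURITY_CREDENTIALS, "Adm1n");
        Context ctx = new InitialContext(env);

        QueueConnectionFactory qcf = (QueueConnectionFactory) ctx.lookup("QueueConnectionFactory");
        QueueConnection connection = qcf.createQueueConnection("admin", "Adm1n");
        QueueSession session = connection.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
        QueueSender sender = session.createSender((Queue) ctx.lookup("testQueue"));

        // The MDB just logs the message, so the current timestamp makes a handy payload.
        TextMessage message = session.createTextMessage(new java.text.SimpleDateFormat(
                "yyyy-MM-dd'T'HH:mm:ss.SSSXXX").format(new java.util.Date()));
        sender.send(message);

        connection.close();
        ctx.close();
    }
}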

Wednesday, November 20, 2013

Standalone JBoss Messaging JMS client with EAP 5.1.0 jars

commons-logging.jar
concurrent.jar
javassist.jar
jboss-aop-client.jar
jboss-client.jar
jboss-common-core.jar
jboss-ha-client.jar
jboss-ha-legacy-client.jar
jboss-javaee.jar
jboss-logging-jdk.jar
jboss-logging-log4j.jar
jboss-logging-spi.jar
jboss-main-client.jar
jboss-messaging-client.jar
jboss-metadata.jar
jboss-mdr.jar
jboss-remoting.jar
jboss-security-spi.jar
jboss-serialization.jar
jbossall-client.jar
jbosscx-client.jar
jbossjts-integration.jar
jbossjts.jar
jnp-client.jar
log4j.jar
logkit.jar
policy.jar
scout.jar
slf4j-api.jar
slf4j-jboss-logging.jar
trove.jar
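
With those jars on the classpath, a minimal standalone client looks roughly like this (a sketch; the host, port and queue name are assumptions to adjust for your server):

import java.util.Properties;
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.naming.Context;
import javax.naming.InitialContext;

public class StandaloneJbmClient {
    public static void main(String[] args) throws Exception {
        Properties env = new Properties();
        // Standard JNP settings for EAP 5.x; point the URL at your server.
        env.put(Context.INITIAL_CONTEXT_FACTORY, "org.jnp.interfaces.NamingContextFactory");
        env.put(Context.URL_PKG_PREFIXES, "org.jboss.naming:org.jnp.interfaces");
        env.put(Context.PROVIDER_URL, "jnp://localhost:1099");
        Context ctx = new InitialContext(env);

        ConnectionFactory cf = (ConnectionFactory) ctx.lookup("ConnectionFactory");
        Queue queue = (Queue) ctx.lookup("queue/DLQ"); // any queue deployed on the server
        Connection connection = cf.createConnection();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        MessageProducer producer = session.createProducer(queue);
        producer.send(session.createTextMessage("hello from a standalone client"));

        connection.close();
        ctx.close();
    }
}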

Locate class in jar file

A useful tool when you're messing with jar dependencies inside Tibco.
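
If the tool is not at hand, a quick equivalent is easy to script (a sketch: it recursively scans the jars under a directory for a class name fragment):

import java.io.File;
import java.util.Enumeration;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

public class FindClassInJars {
    // Usage: java FindClassInJars <directory> <class name or path fragment>
    public static void main(String[] args) throws Exception {
        scan(new File(args[0]), args[1]);
    }

    private static void scan(File dir, String needle) throws Exception {
        File[] files = dir.listFiles();
        if (files == null) {
            return;
        }
        for (File f : files) {
            if (f.isDirectory()) {
                scan(f, needle);
            } else if (f.getName().endsWith(".jar")) {
                JarFile jar = new JarFile(f);
                for (Enumeration<JarEntry> e = jar.entries(); e.hasMoreElements();) {
                    String entry = e.nextElement().getName();
                    if (entry.endsWith(".class") && entry.contains(needle)) {
                        System.out.println(f.getPath() + " -> " + entry);
                    }
                }
                jar.close();
            }
        }
    }
}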

Tuesday, November 19, 2013

TDI can measure XML tree creation times per node

A new build of TDI in profiler mode measures creation times per XML tree node. The reported time is an average with nanosecond precision, for a given nesting level in the XML tree.


Wednesday, November 6, 2013

Tibco flow tuning for high throughput

The most important settings are RAM, MaxJobs/FlowLimit, Engine.ThreadCount, the HTTP Response Thread Pool size (its type can be set to 'single' for savings) and JMS MaxSessions (in client ack mode).
MaxSessions and MaxJobs should be equal per process, and the same goes for correlated MaxJobs and the HTTP Thread Pool. Engine.ThreadCount should be 10-20% greater than the sum of FlowLimits and HTTP threads (for example, FlowLimits totalling 50 plus 16 HTTP threads call for roughly 73-79 engine threads). When components are stacked, each following one should have a FlowLimit exceeding the previous one by at least 20% (to allow for BW engine restarts and backlog). You can also reduce RAM usage by tweaking XML namespace handling: prefixes defined in the process namespace registry should match the prefixes used in activities, and to get rid of redundant namespace declarations on every node you can 'exclude prefixes' from XML roots.
An interesting case arises with a huge volume of large messages (~1000 per second, >100 KB) handled by different processes of varying duration: resource utilization is high and every process starter is flow controlled. When flow control kicks in, it closes the JMS receiver; the receiver usually has prefetched messages inside its session, so all the work spent fetching them is lost and repeated by another receiver. On top of that, the default receive time unit is 1 second, which is not enough under heavy load. The result is the same messages floating around, stuck between the server and the BW process, and overall performance degrades badly. The solution is to disable prefetch and to increase the JMS receiver timeout. The case can be traced with 'show consumers full' in the tibemsadmin console.
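
To make the mechanics concrete outside of BW, here is a plain-JMS sketch (not BW code; the connection factory, queue name and credentials are assumptions). It shows a client-ack consumer that polls with the short default timeout and then has its receiver closed, which is roughly what a flow-controlled process starter does; closing the receiver while prefetched messages sit unacknowledged in the session is what sends them back to the server for redelivery to another consumer.

import javax.jms.Message;
import javax.jms.Queue;
import javax.jms.QueueConnection;
import javax.jms.QueueConnectionFactory;
import javax.jms.QueueReceiver;
import javax.jms.QueueSession;
import javax.jms.Session;
import javax.naming.InitialContext;

public class FlowControlledConsumerSketch {
    public static void main(String[] args) throws Exception {
        InitialContext ctx = new InitialContext(); // jndi.properties points at EMS
        QueueConnectionFactory qcf = (QueueConnectionFactory) ctx.lookup("QueueConnectionFactory");
        QueueConnection connection = qcf.createQueueConnection("user", "pass");
        QueueSession session = connection.createQueueSession(false, Session.CLIENT_ACKNOWLEDGE);
        QueueReceiver receiver = session.createReceiver((Queue) ctx.lookup("testQueue"));
        connection.start();

        // Short timeout (the 1 second default) means frequent idle polls under load.
        Message message = receiver.receive(1000);
        if (message != null) {
            // ... do the work ...
            message.acknowledge(); // client ack: only now is the message removed
        }

        // Any prefetched but unacknowledged messages left in the session at this
        // point are redelivered to another consumer - the wasted work described above.
        receiver.close();
        connection.close();
    }
}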