Channel: Instrument Control (GPIB, Serial, VISA, IVI) topics
Viewing all 5666 articles

Signals between LabVIEW and Nucleo L053R8


Hi.

I am a beginner at using LabVIEW with microcontrollers. I used to use a myDAQ or an ELVIS II board to build electrical systems.

I have a Nucleo L053R8 board and a prototype board.

I want to control one (or more) LEDs on the prototype board. I have a VI which switches an LED on/off, but I can't make it work on the device.

I don't know how to drive signals on A0 and the other pins. I know I should use VISA Write, but I have no idea what data to give to the "write buffer".

I can't find any good manuals or tutorials for this. Can someone help me, or link to a guide that covers it?

 

Below is my example VI. How should I wire it if I want to send true/false to A0?
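For what it's worth, VISA Write on its own isn't enough here: the Nucleo needs firmware that reads its virtual COM port and drives A0 itself; LabVIEW can only send bytes. A minimal sketch of such a protocol, expressed in Python for compactness (the single-character command set is an invented convention, not anything the board understands out of the box):

```python
def a0_command(state):
    """Build the VISA write buffer for an assumed LED protocol: the Nucleo
    firmware (which you would write yourself, e.g. with mbed) reads one byte
    from its serial port and drives A0 high on b"1", low on b"0"."""
    return b"1" if state else b"0"

# With hardware attached, the same buffer is what VISA Write would send.
# In Python terms (pyvisa; the resource name is an example):
#   import pyvisa
#   rm = pyvisa.ResourceManager()
#   nucleo = rm.open_resource("ASRL3::INSTR", baud_rate=9600)
#   nucleo.write_raw(a0_command(True))   # LED on
```

In the VI, the equivalent is wiring a Boolean through a Select node producing the string "1" or "0" into the VISA Write buffer input.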


Nonlinear model predictive control (MPC)


The MPC VIs accept only two kinds of model: a transfer function or a state-space model. My model is nonlinear. How can I obtain a transfer function or state-space model from it in LabVIEW, so that I can apply MPC to the model?
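A common route, in case it helps: linearize the nonlinear model dx/dt = f(x, u) about an operating point to obtain the A and B matrices of a state-space model, which the MPC VIs do accept. A numerical sketch using finite-difference Jacobians (the dynamics you pass in would be your own model; this is generic code, not a LabVIEW API):

```python
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """Finite-difference linearization of dx/dt = f(x, u) about (x0, u0).

    Returns A, B such that dx/dt ~ A(x - x0) + B(u - u0) near the
    operating point; A is df/dx, B is df/du, one column per perturbation.
    """
    n, m = len(x0), len(u0)
    f0 = np.asarray(f(x0, u0), dtype=float)
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for i in range(n):
        dx = np.array(x0, dtype=float)
        dx[i] += eps
        A[:, i] = (np.asarray(f(dx, u0)) - f0) / eps
    for j in range(m):
        du = np.array(u0, dtype=float)
        du[j] += eps
        B[:, j] = (np.asarray(f(x0, du)) - f0) / eps
    return A, B
```

The resulting A and B are only valid near the chosen operating point; for a strongly nonlinear plant you may need to re-linearize at each step (successive linearization) or at several operating points.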

UART LabVIEW data loss and delay


We are using an Arduino to obtain location data (in a coordinate system) and need to transfer the data from the Arduino to LabVIEW. We have observed that the time delay affects the performance and accuracy of the data; LabVIEW is unable to read the data from the Arduino reliably. We currently use a UART program in LabVIEW to transfer 9 digits as a string. Sometimes LabVIEW fails to read all 9 digits and some data is lost. Increasing the time delay on the Arduino side solves the problem, but we need to maintain the accuracy, so we are looking for an alternative solution. Is there another solution to this problem?
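One alternative that avoids tuning delays: frame each 9-digit reading with start/end delimiter characters on the Arduino side, then on the receiving side accumulate incoming bytes and extract only complete frames, carrying any partial frame over to the next read. The logic, sketched in Python (the '<'/'>' delimiters are an assumed convention you would add to the Arduino sketch; in LabVIEW the same loop is a shift register holding the leftover string):

```python
def extract_frames(buffer):
    """Pull complete <ddddddddd> frames out of an accumulated string.

    Returns (frames, leftover): frames is a list of 9-digit payloads,
    leftover is the trailing partial frame to prepend to the next read.
    """
    frames = []
    while True:
        start = buffer.find("<")
        if start == -1:
            return frames, ""          # no frame start: discard noise
        end = buffer.find(">", start)
        if end == -1:
            return frames, buffer[start:]  # incomplete frame: keep for later
        payload = buffer[start + 1:end]
        if len(payload) == 9 and payload.isdigit():
            frames.append(payload)     # drop malformed frames silently
        buffer = buffer[end + 1:]
```

With framing, a reading that arrives split across two reads is simply completed on the next iteration instead of being lost, so the Arduino-side delay can stay small.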

Having problems taking USB scanned data input into my VI along with GPIB instrument data to save to file.


Gentlemen,

I have been trying to integrate the WT500 via GPIB, taking instrument data and saving it to a spreadsheet file with headers. Because I have a different DUT each time I test, I chose to read a bar code into the VI (at the beginning of each row) below the main header. I can't make that happen. The scanned text comes in fine via USB, but it won't behave after that. It appears it needs to stay a string; otherwise all the alphabetic characters are filtered out, leaving only numbers. That's a problem because I need both, and I don't know how to combine the two into one array to be saved as shown in the attachment. Furthermore, when I close this VI it won't save the header assignments; I have to retype them all over again.
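On the combining question: if the numeric readings are converted to strings first, the barcode and the measurements can live in one string array and be written as a single spreadsheet row (in a VI, Number To Fractional String plus Build Array does the same job). A sketch of the idea in Python (the tab separator and three-decimal format are arbitrary choices):

```python
def build_row(barcode, readings):
    """Combine a barcode string and numeric instrument readings into one
    spreadsheet row: format the numbers as strings so the whole row is a
    single string array, then join with tabs."""
    return "\t".join([barcode] + ["%.3f" % r for r in readings])
```

Keeping everything as strings until the file write is what avoids the "numbers only" filtering: a numeric array can never hold the alphabetic barcode characters.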

 

Any suggestions would be appreciated.

 

Thanks,

Eric

GPIB-USB-HS


I have two identical interfaces. One of them works properly; the other, as soon as it is connected to the PC, turns on the active LED (green),

but after a couple of seconds the ready LED glows amber and both lights switch off.

After that, Windows Device Manager shows it in the wrong state and MAX does not recognize it.

It generates event ID 441 with the following event data:

 EventData
  DeviceInstanceId: USB\VID_3923&PID_709B\01886823
  LastDeviceInstanceId: USB\VID_148F&PID_2573\5&165678ce&2&8
  ClassGuid: {4D36E972-E325-11CE-BFC1-08002BE10318}
  LocationPath: PCIROOT(0)#PCI(1D07)#USBROOT(0)#USB(8)
  MigrationRank: 0xf000ffffe0000033
  Present: false
  Status: 0xc0000719

Any suggestions?

Thank you in advance

 

Sending an int value to an Arduino to operate a Digital Potentiometer


Hello,

I have a digital potentiometer working with some Arduino code, and I am now attempting to pass an int value to the Arduino from LabVIEW in order to control the resistance with a LabVIEW dial. I am having trouble figuring out a way to do this, as VISA and LINX don't have a way (unless I missed something) to pass an integer. Could I do it by converting a string (which I am able to send to the Arduino from LabVIEW) into an integer?
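Converting via a string is indeed the usual trick: format the integer as ASCII text with a terminator, send it with VISA Write, and let Serial.parseInt() on the Arduino side recover the number. The encoding side, sketched in Python (the 0-255 range assumes an 8-bit digipot; adjust for yours):

```python
def dial_to_message(value):
    """Encode the dial's integer as an ASCII decimal string with a newline
    terminator, ready for VISA Write.  On the Arduino side,
    Serial.parseInt() recovers the integer from this text."""
    if not 0 <= value <= 255:
        raise ValueError("value outside assumed 8-bit digipot range")
    return ("%d\n" % value).encode("ascii")
```

In the VI this is just Format Into String ("%d" plus a line feed) wired into VISA Write, so no special integer-passing feature of VISA or LINX is needed.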

 

Thanks,

J

Maximum sampling rate


hi, 

    I am using a PicoScope 4423 over USB and creating a virtual oscilloscope in LabVIEW. I want to know the maximum sampling rate I can get. As of now I am unable to get more than 1M samples per second, in other words a sample interval of 1 µs. Is this a limitation of the hardware, of LabVIEW, or of my programming? I am able to reach a maximum frequency of 500 kHz.
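One observation: the 500 kHz ceiling is exactly what a 1 µs sample interval allows, since the Nyquist criterion limits you to half the sampling rate. Whether the 4423 and its driver can be configured to sample faster is a separate question, but the arithmetic is:

```python
def max_frequency_hz(sample_interval_s):
    """Nyquist limit: the highest frequency representable at a given
    sample interval is half the sampling rate (1 / interval)."""
    rate_hz = 1.0 / sample_interval_s
    return rate_hz / 2.0
```

So seeing 500 kHz at a 1 µs interval is not a separate symptom; to see higher frequencies, the sample interval itself has to come down first.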

 

I have attached my .vi files below.

How to change GPIB device primary address


Hello ;

       I'm a newcomer to the GPIB field. We want to integrate one device into our automatic test environment.

The GPIB primary address of the device is shown as 25 in NI MAX (see attached). We intend to change the primary address to 4. Could someone tell me how to change the primary address of this device?

 

thanks!

 

BR//jackie 

  


Controlling ADALM1000 (M1K) from Analog Devices


Hi everyone,

I am trying to use the ADALM1000 device from Analog Devices as an SMU, controlling it with LabVIEW. The device has two channels, each of which you can configure to measure a voltage, to source a voltage and measure a current, or to source a current and measure a voltage.
There is a program (PixelPulse) that can do all of that, but I need to do it in LabVIEW; at a minimum I need to be able to source a voltage and measure a current to plot I-V curves.

So I was wondering whether anyone has managed, or tried, to do that yet?

 

So far I have gone through the source code of PixelPulse (which is available on Analog Devices' GitHub) to understand how to control the device. Since I am not really good at C I still don't understand everything, but I know which library it uses (libsmu).
I was thinking of calling this library from LabVIEW following these instructions (http://forums.ni.com/t5/Example-Program-Drafts/Using-Existing-C-Code-or-a-DLL-in-LabVIEW/ta-p/3499233), but I don't know whether this is a good approach and whether I will be able to use it afterwards.

Thanks in advance for your opinion on the matter! 

Sharing COM PORT MOXA


Hello

 

I am using 3 computers which run the same VI.

They are connected via Ethernet, and I have a MOXA (NPort 5150) connected to my RS232 device.

I configured 3 different COM PORT number depending on which computer needs to have access to the COM PORT.

COM 1 = PC 1

COM 2 = PC 2

COM 3 = PC 3

 

They do not use the COM port all the time, and never at the same time. But as soon as one PC has used the COM port, the other two COM ports on the other PCs become unusable (I can see this in NI MAX). And I do close the VISA COM port in my software...

 

So is it because I don't close the port properly?

Or is it not possible to share one COM port with a MOXA?

 

Thank you

Agilent 82347B GPIB-USB connection problem


Hey

 

I'm trying to connect to a DS345 function generator with an Agilent 82347B GPIB-USB interface.

Let me start by saying that I've read most posts I found on this board and others on this topic, and could not find a solution to my problem.

 

After following the installation steps (here) and settings (here), when I run an auto scan in Connection Expert (ver 18.0; I've also tried older versions down to 15), the GPIB interface is recognized, but the device connected to it is not (I've tried an SRS DS345 and an SRS SR865). In NI MAX (ver 17) I can see the interface but, again, no instrument (of course I verified the GPIB address on my device). Only if I manually add the device in Connection Expert can I see the instrument in both Connection Expert and NI MAX. But I can't even send a *IDN? query, in either NI MAX (hex 0xBFFF0015) or Connection Expert (0xBFFF0015). Also, if I try to use the NI MAX VISA interactive control I get an error message (0xBFFF003A); see the attached files for all the errors I get.

 

In the past I had no problem connecting to this device with the same setup (older versions of NI MAX and Connection Expert). Since then, I've installed an optical design program called OSLO by Lambda Research, and in order to install its USB security dongle I had to follow these instructions to activate it. The first thing I tried, to rule out this possible conflict, was to remove the logos.ini file I had created, but no luck there.

 

I'm not sure whether that last part has any connection to the problem, but I'm really lost here.

Appreciate any help!  

  

Assigning parameters for write_ascii_values in pyvisa for serial port


Hi all,

 

I want to control my Keithley 6485 Picoammeter externally by connecting it via an RS232-to-USB adapter to my Linux PC (CentOS 6.9), writing code in Python (version 2.7.13) with pyvisa:

 

import sys
import visa
from visa import constants

# Open the serial connection: 9600 baud, 8 data bits, one stop bit, no
# parity, CR termination (flow control defaults to none).  The serial
# settings must be passed to open_resource() (or set as session attributes);
# merely naming constants.VI_ASRL_STOP_ONE etc. on their own lines does nothing.
rm = visa.ResourceManager('/usr/local/vxipnp/linux/lib64/libvisa.so')
amm = rm.open_resource('ASRL2::INSTR',
                       baud_rate=9600,
                       data_bits=8,
                       stop_bits=constants.StopBits.one,
                       parity=constants.Parity.none,
                       write_termination='\r',
                       read_termination='\r')

amm.write("*RST")# Return 6485 to RST defaults
print(amm.query("SYST:ERR?"))# Queries must be read back with query(), not write()
amm.write("TRIG:DEL 0")# Set trigger delay to zero seconds
amm.write("TRIG:COUNT 2500")# Set trigger count to 2500
amm.write("SENS:CURR:RANG:AUTO OFF")# Turn auto range off
amm.write("SENS:CURR:NPLC .01")# Set integration rate to NPLC 0.01
amm.write("SENS:CURR:RANG 2e-7")# Use 200 nA range
amm.write("SYST:ZCH OFF")# Turn zero check off
amm.write("SYST:AZER:STAT OFF")# Turn auto zero off
amm.write("DISP:ENAB OFF")# Turn display off
amm.write("*CLS")# Clear status model
amm.write("TRAC:POIN 2500")# Set buffer size to 2500
amm.write("TRAC:CLE")# Clear buffer
amm.write("TRAC:FEED:CONT NEXT")# Set storage control to start on next reading
amm.write("STAT:MEAS:ENAB 512")# Enable buffer-full measurement event
amm.write("*SRE 1")# Enable SRQ on buffer-full measurement event
amm.query("*OPC?")# Synchronize: read the "1" response here, or the next
                  # read will pick it up instead of the buffer data
amm.write("INIT")# Start taking and storing readings; wait for SRQ/buffer full
amm.write("DISP:ENAB ON")# Turn display on
print(amm.query_ascii_values("TRAC:DATA?"))# Request data from buffer

The problem is that when I run this script, I just get "1" as the print output, although the data should be returned in ASCII as Reading, Timestamp, Status; and I get the error message -113 Undefined header after amm.write("*RST"). So I think the messages are not transferred correctly.

I know that over the RS-232 interface only the ASCII format is allowed. But when I follow the example in the pyvisa documentation, with write_ascii_values(text, values) and a list as argument, I only get a -100 Command error from the device.

Can somebody please tell me how to set the parameters of write_ascii_values correctly, or what I am doing wrong? Are my settings for the serial device wrong? Sometimes when I execute the script twice I also get the error "VI_ERROR_ASRL_FRAMING (-1073807253): A framing error occurred during transfer." I just do not know what to do.

Thank you!

Regards, Roland

Delay in output signal (unwanted)


Hello,

I'm new to LabVIEW.
I'm trying to make a program that reads an input and, if the input is 1, outputs a sinusoidal wave; if it is zero, it outputs zero.

I am successfully reading the input and can output the sine wave. However, when I look at the signal on the oscilloscope, there is a 5-10 ms 'delay' once every second, and if I change the input signal, the output only updates after 1 s. I have attached pictures of my program and the output.

Is there a way to output the signal continuously (without the delay)?

Is it possible to have the wave update in real time?

Thank you

 

 

Multiple barcode scanner RS-232


Hello All,

I will be using 5 RS-232 barcode scanners with RS-232 to USB converters.

I want to read the barcode string only when the operator triggers the scanner, i.e. I don't want to poll the port all the time.

How can I do it?

 

Thanks

Data acquisition


I have a GCTS data acquisition system and I need to read strain, pressure, and deformation data. The problem is that the DAQ Assistant does not recognize the device. What is the best method to perform this data acquisition? I use LabVIEW 7.1.


Controlling one instrument with two applications using VBA


Hi everyone,

 

I am writing two programs to control a signal generator using VBA.

The problem is that these two programs can't run in parallel; sometimes they get stuck.

Even if I use two PCs, it still happens.

 

How can I solve the problem?

 

Thank you and best regards

Programming Configuration (.ini) File to Call Specific RS-232 Hexadecimal Commands during Simulation


Hello, 

 

I have two separate VIs: one sends and receives hexadecimal commands from text files over RS-232, and the other uses a configuration (.ini) file to manipulate CompactDAQ channels (analogue output / digital output modules).

 

What I want to achieve is for both VIs to operate as one program, whereby as the channels are set and active in hardware, the program can fire a specific RS-232 command (equivalent to a subroutine) and receive a response. I want this so that I can track the parameters set in the motor controller and make sure everything is bug-free.

 

I think my main concern is whether these two VIs are compatible with each other. At this stage I have two separate programs running with different file formats, one using .txt and the other .ini.

 

I need to be able to call a subroutine from the .ini file, or in the VI somehow, to fire a specific hexadecimal .txt command depending on what is being exercised on the cDAQ modules.
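On the .ini side, one workable pattern is to store each command's hex bytes as a value in the configuration file and decode them at load time, so the cDAQ VI only has to look up a name and hand the bytes to VISA Write. A sketch in Python (the section and key names here are invented for illustration, not from the original files):

```python
import configparser

# Example .ini fragment: each named command maps to its RS-232 hex bytes.
INI_TEXT = """
[rs232_commands]
motor_enable = 01 0A FF
motor_stop   = 01 0B 00
"""

def load_commands(text):
    """Parse the .ini text into a {name: bytes} table ready for VISA Write."""
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    return {name: bytes.fromhex(value)
            for name, value in cfg["rs232_commands"].items()}
```

This removes the format mismatch: the .txt hex commands become values inside the same .ini file the channel VI already reads, and "calling a subroutine" reduces to a dictionary lookup keyed by the active channel's command name.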

NI-Spy and NI I/O Trace: how do they work?


Hi everyone,

 

I want to write a program that works similarly to NI-Spy or NI I/O Trace, using VBA.

Could anyone explain how they work, or which API I should use?

 

Thank you so much

IVI - Errors when using GetSpecificDriverCHandle


Hello,

I created a C++ DLL for the IviDmm class and I am calling the Keysight 34980 IVI driver. This custom C++ DLL contains all the generic IviDmm functions and works fine.

Now I wanted to run a device-specific function of the 34980A driver. For this I execute IviDmm_GetSpecificDriverCHandle, which returns a numeric handle without error.

 

Then I run a function of another custom C++ DLL where I call some device-specific functions.

For example:

viStatus = Ag34980a_SetAttributeViBoolean (Vi, Channel, AG34980A_ATTR_OUTPUT_STATE, State);

 

For the parameter "Vi" I pass the specific-driver C handle that was created by the other DLL.

 

But the function returns:

-1074135040 = Failure cannot recover.

 

What is the error here?

The same error also occurs in simulation mode.

Thanks

 

NI MAX 16.0.0f0

ICP 16.0.1

IviSharedComponents 242
Driver setup in MAX: DMM=true, slot1=34932A, slot2=34932A, slot3=34932A, slot4=34932A, slot6=34941A, slot8=34937A

Keysight driver: 1.5.7.0

 

 

Agilent 34972 EZ examples freeze DAQ


I'm playing with LabVIEW and the Agilent 34972 to get a feel for instrument control. I am using two of the example EZ blocks from the palette, pictured below. My simple VI runs fine using just one EZ block, either one. However, using both together freezes the DAQ. I think I am initializing incorrectly...? Any help is appreciated.

 

agilentvi.PNG 
